path: root/run
Commit message (Author, Age; files changed, -lines removed/+lines added)
* Add Apache license headers and LICENSE and NOTICE files (Matei Zaharia, 2013-07-16; 1 file, -0/+17)
|
* Merge branch 'master' into 2013-06/assembly-jar-deploy (Evan Chan, 2013-06-28; 1 file, -33/+71)
|\
| | Conflicts:
| |     run
| |
| | Previous changes that I made to run and set-dev-classpath.sh instead
| | have been folded into compute-classpath.sh
| * Look at JAVA_HOME before PATH to determine Java executable (Matei Zaharia, 2013-06-27; 1 file, -4/+5)
| |
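The commit above changes which Java binary the `run` script picks. A minimal sketch of that lookup order (not the literal script; the `RUNNER` variable name is illustrative): prefer `$JAVA_HOME/bin/java`, and only fall back to whatever `java` happens to come first on the PATH.

```shell
#!/bin/sh
# Sketch: resolve the Java executable, preferring JAVA_HOME over PATH.
if [ -n "$JAVA_HOME" ]; then
  RUNNER="${JAVA_HOME}/bin/java"   # explicit JDK wins
else
  RUNNER=java                      # fall back to PATH lookup
fi
echo "Using Java runner: $RUNNER"
```

Checking JAVA_HOME first lets a user pin a specific JDK for Spark even when a different `java` shadows it on the PATH.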
| * Fix computation of classpath when we launch java directly (Matei Zaharia, 2013-06-25; 1 file, -57/+10)
| |
| | The previous version assumed that a CLASSPATH environment variable was
| | set by the "run" script when launching the process that starts the
| | ExecutorRunner, but unfortunately this is not true in tests. Instead, we
| | factor the classpath calculation into an external script and call that.
| |
| | NOTE: This includes a Windows version but hasn't yet been tested there.
| * Merge remote-tracking branch 'cgrothaus/SPARK-698' (Matei Zaharia, 2013-06-25; 1 file, -7/+11)
| |\
| | | Conflicts:
| | |     run
| | * Incorporate feedback from mateiz: (Christoph Grothaus, 2013-02-24; 1 file, -4/+5)
| | |
| | | - we do not need getEnvOrEmpty
| | | - Instead of saving SPARK_NONDAEMON_JAVA_OPTS, it would be better to
| | |   modify the scripts to use a different variable name for the JAVA_OPTS
| | |   they do eventually use
| | * Fix SPARK-698. From ExecutorRunner, launch java directly instead of via the run scripts. (Christoph Grothaus, 2013-02-20; 1 file, -0/+3)
| | |
| * | Fix resolution of example code with Maven builds (Matei Zaharia, 2013-06-22; 1 file, -2/+6)
| | |
* | | Get rid of debugging statements (Evan Chan, 2013-06-25; 1 file, -3/+0)
| | |
* | | Split out source distro CLASSPATH logic to a separate script (Evan Chan, 2013-06-24; 1 file, -104/+19)
|/ /
* | Only check for repl classes if the user is running the repl. Otherwise, check for core classes in run. (Reynold Xin, 2013-05-16; 1 file, -2/+8)
| |
| | This fixed the problem that core tests depend on whether the repl module
| | is compiled or not.
* | 1) Add support for HADOOP_CONF_DIR (and/or YARN_CONF_DIR - use either), which specifies the client-side configuration directory that needs to be part of the CLASSPATH. 2) Move from var+=".." to var="$var..": the former does not work on older bash shells, unfortunately. (Mridul Muralidharan, 2013-05-11; 1 file, -27/+38)
| |
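The second point in the commit above is a shell-portability detail worth illustrating: `var+="..."` is a bash-ism that older shells reject, while `var="$var..."` is POSIX-safe. The paths below are made up for the example and are not the script's actual entries.

```shell
#!/bin/sh
# Portable classpath building: append with var="$var..." rather than
# the bash-only var+="..." form.
CLASSPATH="core/target/classes"
CLASSPATH="$CLASSPATH:repl/target/classes"      # portable append
if [ -n "$HADOOP_CONF_DIR" ]; then
  CLASSPATH="$CLASSPATH:$HADOOP_CONF_DIR"       # client-side Hadoop config
fi
echo "$CLASSPATH"
```

Under a plain POSIX `sh`, the `+=` form would fail outright, which is why the commit switched styles.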
* | Fix issues reported by Reynold (Mridul Muralidharan, 2013-04-30; 1 file, -3/+4)
| |
* | Reversed the order of tests to find a scala executable (in the case when SPARK_LAUNCH_WITH_SCALA is defined) (Mike, 2013-04-11; 1 file, -5/+6)
| |
| | Instead of checking in the PATH first, and only then (if not found) for
| | SCALA_HOME, now we check for SCALA_HOME first, and only then (if not
| | defined) do we look in the PATH. The advantage is that now if the user has
| | a more recent (non-compatible) version of scala in her PATH, she can use
| | SCALA_HOME to point to the older (compatible) version for use with spark.
| | Suggested by Josh Rosen in this thread:
| | https://groups.google.com/forum/?fromgroups=#!topic/spark-users/NC9JKvP8808
* | Merge pull request #553 from pwendell/akka-standalone (Matei Zaharia, 2013-04-08; 1 file, -0/+1)
|\ \
| | | SPARK-724 - Have Akka logging enabled by default for standalone daemons
| * | Updating based on code review (Patrick Wendell, 2013-04-07; 1 file, -1/+1)
| | |
| * | SPARK-724 - Have Akka logging enabled by default for standalone daemons (Patrick Wendell, 2013-04-03; 1 file, -0/+1)
| | |
| | | See the JIRA for more details. I was only able to test the bash version
| | | (I don't have Windows), so maybe check that the syntax is correct there.
* | | Merge remote-tracking branch 'kalpit/master' (Matei Zaharia, 2013-04-07; 1 file, -1/+1)
|/ /
| |
| | Conflicts:
| |     project/SparkBuild.scala
* | Small hack to work around multiple JARs being built by sbt package (Matei Zaharia, 2013-02-26; 1 file, -5/+6)
| |
* | Pass a code JAR to SparkContext in our examples. Fixes SPARK-594. (Matei Zaharia, 2013-02-25; 1 file, -0/+10)
| |
* | Change tabs to spaces (Matei Zaharia, 2013-02-25; 1 file, -15/+15)
| |
* | Fixed class paths and dependencies based on Matei's comments. (Tathagata Das, 2013-02-24; 1 file, -3/+2)
| |
* | Merge branch 'mesos-master' into streaming (Tathagata Das, 2013-02-24; 1 file, -0/+20)
|\ \
| * | support customized java options for master, worker, executor, repl shell (haitao.yao, 2013-02-16; 1 file, -0/+20)
| |/
* | Merge branch 'mesos-master' into streaming (Tathagata Das, 2013-02-20; 1 file, -0/+12)
|\|
| | Conflicts:
| |     core/src/main/scala/spark/rdd/CheckpointRDD.scala
| |     streaming/src/main/scala/spark/streaming/dstream/ReducedWindowedDStream.scala
| * Use a separate memory setting for standalone cluster daemons (Matei Zaharia, 2013-02-10; 1 file, -0/+12)
| |
| | Conflicts:
| |     docs/_config.yml
* | Merge branch 'mesos-master' into streaming (Tathagata Das, 2013-02-07; 1 file, -3/+5)
|\|
| * Update run script to deal with change to build of REPL shaded JAR (Matei Zaharia, 2013-01-20; 1 file, -3/+5)
| |
* | Merge pull request #372 from Reinvigorate/sm-kafka (Tathagata Das, 2013-02-07; 1 file, -0/+3)
|\ \
| |/
|/| Removing offset management code that is non-existent in kafka 0.7.0+
| * kafka jar wasn't being included by run script (seanm, 2013-01-18; 1 file, -0/+3)
| |
* | | Merge branch 'master' into streaming (Matei Zaharia, 2013-01-20; 1 file, -0/+7)
|\ \
| | | Conflicts:
| | |     core/src/main/scala/spark/api/python/PythonRDD.scala
| * | Warn users if they run pyspark or spark-shell without compiling Spark (Matei Zaharia, 2013-01-17; 1 file, -0/+7)
| | |
* | | Merge branch 'master' into streaming (Tathagata Das, 2013-01-15; 1 file, -9/+7)
|\| |
| |/
|/|
| | Conflicts:
| |     core/src/main/scala/spark/rdd/CoGroupedRDD.scala
| |     core/src/main/scala/spark/rdd/FilteredRDD.scala
| |     docs/_layouts/global.html
| |     docs/index.md
| |     run
| * Merge pull request #346 from JoshRosen/python-api (Matei Zaharia, 2013-01-12; 1 file, -0/+4)
| |\
| | | Python API (PySpark)
| | * Rename top-level 'pyspark' directory to 'python' (Josh Rosen, 2013-01-01; 1 file, -1/+1)
| | |
| | * Merge remote-tracking branch 'origin/master' into python-api (Josh Rosen, 2012-12-29; 1 file, -18/+40)
| | |\
| | | | Conflicts:
| | | |     docs/quick-start.md
| | * | Simplify PySpark installation. (Josh Rosen, 2012-12-27; 1 file, -0/+4)
| | | |
| | | | - Bundle Py4J binaries, since it's hard to install
| | | | - Uses Spark's `run` script to launch the Py4J gateway, inheriting the
| | | |   settings in spark-env.sh
| | | |
| | | | With these changes, (hopefully) nothing more than running `sbt/sbt
| | | | package` will be necessary to run PySpark.
| * | | Retrieve jars to a flat directory so * can be used for the classpath. (Stephen Haberman, 2013-01-08; 1 file, -9/+3)
| | |/ | |/|
* | | Removed streaming-env.sh.template (Tathagata Das, 2013-01-06; 1 file, -4/+0)
| | |
* | | Merge branch 'master' of github.com:mesos/spark into dev (Reynold Xin, 2012-12-20; 1 file, -5/+10)
|\| |
| | | Conflicts:
| | |     core/src/main/scala/spark/MapOutputTracker.scala
| | |     core/src/main/scala/spark/PairRDDFunctions.scala
| | |     core/src/main/scala/spark/ParallelCollection.scala
| | |     core/src/main/scala/spark/RDD.scala
| | |     core/src/main/scala/spark/rdd/BlockRDD.scala
| | |     core/src/main/scala/spark/rdd/CartesianRDD.scala
| | |     core/src/main/scala/spark/rdd/CoGroupedRDD.scala
| | |     core/src/main/scala/spark/rdd/CoalescedRDD.scala
| | |     core/src/main/scala/spark/rdd/FilteredRDD.scala
| | |     core/src/main/scala/spark/rdd/FlatMappedRDD.scala
| | |     core/src/main/scala/spark/rdd/GlommedRDD.scala
| | |     core/src/main/scala/spark/rdd/HadoopRDD.scala
| | |     core/src/main/scala/spark/rdd/MapPartitionsRDD.scala
| | |     core/src/main/scala/spark/rdd/MapPartitionsWithSplitRDD.scala
| | |     core/src/main/scala/spark/rdd/MappedRDD.scala
| | |     core/src/main/scala/spark/rdd/PipedRDD.scala
| | |     core/src/main/scala/spark/rdd/SampledRDD.scala
| | |     core/src/main/scala/spark/rdd/ShuffledRDD.scala
| | |     core/src/main/scala/spark/rdd/UnionRDD.scala
| | |     core/src/main/scala/spark/storage/BlockManager.scala
| | |     core/src/main/scala/spark/storage/BlockManagerId.scala
| | |     core/src/main/scala/spark/storage/BlockManagerMaster.scala
| | |     core/src/main/scala/spark/storage/StorageLevel.scala
| | |     core/src/main/scala/spark/util/MetadataCleaner.scala
| | |     core/src/main/scala/spark/util/TimeStampedHashMap.scala
| | |     core/src/test/scala/spark/storage/BlockManagerSuite.scala
| | |     run
| * | Make "run" script work with Maven builds (Matei Zaharia, 2012-12-10; 1 file, -5/+10)
| | |
* | | Fixed bugs in RawNetworkInputDStream and in its examples. Made the ReducedWindowedDStream persist RDDs to MEMORY_SER_ONLY by default. Removed unnecessary examples. Added streaming-env.sh.template to add recommended settings for streaming. (Tathagata Das, 2012-11-12; 1 file, -0/+4)
| | |
* | | Merge remote-tracking branch 'public/master' into dev (Matei Zaharia, 2012-10-24; 1 file, -24/+34)
|\| |
| | | Conflicts:
| | |     core/src/main/scala/spark/BlockStoreShuffleFetcher.scala
| | |     core/src/main/scala/spark/KryoSerializer.scala
| | |     core/src/main/scala/spark/MapOutputTracker.scala
| | |     core/src/main/scala/spark/RDD.scala
| | |     core/src/main/scala/spark/SparkContext.scala
| | |     core/src/main/scala/spark/executor/Executor.scala
| | |     core/src/main/scala/spark/network/Connection.scala
| | |     core/src/main/scala/spark/network/ConnectionManagerTest.scala
| | |     core/src/main/scala/spark/rdd/BlockRDD.scala
| | |     core/src/main/scala/spark/rdd/NewHadoopRDD.scala
| | |     core/src/main/scala/spark/scheduler/ShuffleMapTask.scala
| | |     core/src/main/scala/spark/scheduler/cluster/StandaloneSchedulerBackend.scala
| | |     core/src/main/scala/spark/storage/BlockManager.scala
| | |     core/src/main/scala/spark/storage/BlockMessage.scala
| | |     core/src/main/scala/spark/storage/BlockStore.scala
| | |     core/src/main/scala/spark/storage/StorageLevel.scala
| | |     core/src/main/scala/spark/util/AkkaUtils.scala
| | |     project/SparkBuild.scala
| | |     run
| * | Tweaked run file to live more happily with typesafe's debian package (Thomas Dudziak, 2012-10-22; 1 file, -13/+30)
| |/
| * Document how to configure SPARK_MEM & co on a per-job basis (Matei Zaharia, 2012-10-13; 1 file, -7/+0)
| |
| * Made run script add test-classes onto the classpath only if SPARK_TESTING is set; fixes #216 (root, 2012-10-07; 1 file, -2/+4)
| |
| * Don't check for JARs in core/lib anymore (Matei Zaharia, 2012-10-04; 1 file, -3/+0)
| |
| * Update Scala version dependency to 2.9.2 (Matei Zaharia, 2012-09-24; 1 file, -1/+1)
| |
| * Added a unit test for local-cluster mode and simplified some of the code involved in that (Matei Zaharia, 2012-09-07; 1 file, -0/+1)
| |
* | Added the Spark Streaming code, ported to Akka 2 (Matei Zaharia, 2012-07-28; 1 file, -0/+2)
|/