path: root/run
Commit message (Author, Date; files changed, lines +added/-removed)
* Only check for repl classes if the user is running the repl; otherwise,
  check for core classes in run. This fixes the problem that core tests
  depended on whether the repl module was compiled. (Reynold Xin,
  2013-05-16; 1 file, +8/-2)
* 1) Add support for HADOOP_CONF_DIR (and/or YARN_CONF_DIR; either works),
  which specifies the client-side configuration directory that needs to be
  part of the CLASSPATH. 2) Move from var+=".." to var="$var..", since the
  former does not work on older bash shells. (Mridul Muralidharan,
  2013-05-11; 1 file, +38/-27)
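A minimal sketch of both points in this commit, assuming the script builds
a CLASSPATH variable (the exact variable and paths here are illustrative):

    # Portable append: var+="..." requires a newer bash, so the
    # explicit var="$var..." form works on older shells too.
    CLASSPATH="$CLASSPATH:$FWDIR/conf"

    # Put the client-side Hadoop/YARN config directory on the
    # classpath if the user has set either variable.
    if [ -n "$HADOOP_CONF_DIR" ]; then
      CLASSPATH="$CLASSPATH:$HADOOP_CONF_DIR"
    elif [ -n "$YARN_CONF_DIR" ]; then
      CLASSPATH="$CLASSPATH:$YARN_CONF_DIR"
    fi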
* Fix issues reported by Reynold. (Mridul Muralidharan, 2013-04-30;
  1 file, +4/-3)
* Reversed the order of tests used to find a scala executable when
  SPARK_LAUNCH_WITH_SCALA is defined: instead of checking the PATH first
  and falling back to SCALA_HOME, we now check SCALA_HOME first and fall
  back to the PATH. The advantage is that a user with a newer
  (incompatible) scala on the PATH can point SCALA_HOME at an older
  (compatible) version for use with Spark. Suggested by Josh Rosen in this
  thread:
  https://groups.google.com/forum/?fromgroups=#!topic/spark-users/NC9JKvP8808
  (Mike, 2013-04-11; 1 file, +6/-5)
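A sketch of the new lookup order this commit describes (the error handling
is illustrative, not the script's actual text):

    # Prefer the explicitly configured Scala, then fall back to the PATH.
    if [ -n "$SCALA_HOME" ]; then
      RUNNER="$SCALA_HOME/bin/scala"
    elif command -v scala > /dev/null 2>&1; then
      RUNNER="scala"
    else
      echo "SCALA_HOME is not set and no scala is on the PATH" >&2
      exit 1
    fi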
* Merge pull request #553 from pwendell/akka-standalone: SPARK-724, have
  Akka logging enabled by default for standalone daemons. (Matei Zaharia,
  2013-04-08; 1 file, +1/-0)
  * Updating based on code review. (Patrick Wendell, 2013-04-07; 1 file,
    +1/-1)
  * SPARK-724: Have Akka logging enabled by default for standalone
    daemons. See the JIRA for more details. I was only able to test the
    bash version (I don't have Windows), so please check that the syntax
    is correct there. (Patrick Wendell, 2013-04-03; 1 file, +1/-0)
* Merge remote-tracking branch 'kalpit/master'. Conflicts:
  project/SparkBuild.scala. (Matei Zaharia, 2013-04-07; 1 file, +1/-1)
* Small hack to work around multiple JARs being built by sbt package.
  (Matei Zaharia, 2013-02-26; 1 file, +6/-5)
* Pass a code JAR to SparkContext in our examples. Fixes SPARK-594.
  (Matei Zaharia, 2013-02-25; 1 file, +10/-0)
* Change tabs to spaces. (Matei Zaharia, 2013-02-25; 1 file, +15/-15)
* Fixed class paths and dependencies based on Matei's comments.
  (Tathagata Das, 2013-02-24; 1 file, +2/-3)
* Merge branch 'mesos-master' into streaming. (Tathagata Das, 2013-02-24;
  1 file, +20/-0)
  * Support customized Java options for the master, worker, executor, and
    repl shell. (haitao.yao, 2013-02-16; 1 file, +20/-0)
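A hypothetical sketch of how such per-daemon options can be selected; the
variable names are assumptions, not necessarily the ones this commit added:

    # Choose extra JVM options based on the class being launched.
    case "$1" in
      spark.deploy.master.Master) OUR_JAVA_OPTS="$SPARK_MASTER_OPTS" ;;
      spark.deploy.worker.Worker) OUR_JAVA_OPTS="$SPARK_WORKER_OPTS" ;;
      spark.executor.*)           OUR_JAVA_OPTS="$SPARK_EXECUTOR_OPTS" ;;
      spark.repl.Main)            OUR_JAVA_OPTS="$SPARK_REPL_OPTS" ;;
      *)                          OUR_JAVA_OPTS="$SPARK_JAVA_OPTS" ;;
    esac
    JAVA_OPTS="$JAVA_OPTS $OUR_JAVA_OPTS"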
* Merge branch 'mesos-master' into streaming. Conflicts:
  core/src/main/scala/spark/rdd/CheckpointRDD.scala,
  streaming/src/main/scala/spark/streaming/dstream/ReducedWindowedDStream.scala.
  (Tathagata Das, 2013-02-20; 1 file, +12/-0)
  * Use a separate memory setting for standalone cluster daemons.
    Conflicts: docs/_config.yml. (Matei Zaharia, 2013-02-10; 1 file,
    +12/-0)
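A sketch of the idea: the daemons are lightweight, so they get their own,
smaller heap instead of inheriting the executor-sized SPARK_MEM. The
SPARK_DAEMON_MEMORY name and the 512m default are assumptions here:

    if [ "$1" = "spark.deploy.master.Master" ] || \
       [ "$1" = "spark.deploy.worker.Worker" ]; then
      SPARK_MEM="${SPARK_DAEMON_MEMORY:-512m}"
    fi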
* Merge branch 'mesos-master' into streaming. (Tathagata Das, 2013-02-07;
  1 file, +5/-3)
  * Update run script to deal with change to build of REPL shaded JAR.
    (Matei Zaharia, 2013-01-20; 1 file, +5/-3)
* Merge pull request #372 from Reinvigorate/sm-kafka: removing offset
  management code that is non-existent in kafka 0.7.0+. (Tathagata Das,
  2013-02-07; 1 file, +3/-0)
  * The Kafka jar wasn't being included by the run script. (seanm,
    2013-01-18; 1 file, +3/-0)
* Merge branch 'master' into streaming. Conflicts:
  core/src/main/scala/spark/api/python/PythonRDD.scala. (Matei Zaharia,
  2013-01-20; 1 file, +7/-0)
  * Warn users if they run pyspark or spark-shell without compiling Spark.
    (Matei Zaharia, 2013-01-17; 1 file, +7/-0)
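A sketch of such a check; the target path is an assumption based on the
sbt layout of the era:

    # Refuse to start if Spark has not been compiled yet.
    if [ ! -e "$FWDIR/core/target" ]; then
      echo "Failed to find Spark classes in $FWDIR/core/target" >&2
      echo "You need to compile Spark before running this program" >&2
      exit 1
    fi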
* Merge branch 'master' into streaming. Conflicts:
  core/src/main/scala/spark/rdd/CoGroupedRDD.scala,
  core/src/main/scala/spark/rdd/FilteredRDD.scala,
  docs/_layouts/global.html, docs/index.md, run. (Tathagata Das,
  2013-01-15; 1 file, +7/-9)
  * Merge pull request #346 from JoshRosen/python-api: Python API
    (PySpark). (Matei Zaharia, 2013-01-12; 1 file, +4/-0)
    * Rename top-level 'pyspark' directory to 'python'. (Josh Rosen,
      2013-01-01; 1 file, +1/-1)
    * Merge remote-tracking branch 'origin/master' into python-api.
      Conflicts: docs/quick-start.md. (Josh Rosen, 2012-12-29; 1 file,
      +40/-18)
    * Simplify PySpark installation: bundle the Py4J binaries, since Py4J
      is hard to install, and use Spark's run script to launch the Py4J
      gateway, inheriting the settings in spark-env.sh. With these
      changes, (hopefully) nothing more than running sbt/sbt package is
      necessary to run PySpark. (Josh Rosen, 2012-12-27; 1 file, +4/-0)
  * Retrieve jars to a flat directory so * can be used for the classpath.
    (Stephen Haberman, 2013-01-08; 1 file, +3/-9)
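Java 6+ expands a trailing * in a classpath entry to every jar in that
directory, so retrieving dependencies into one flat directory replaces a
per-jar append loop with a single entry; a sketch (the directory name is
an assumption):

    # One wildcard entry instead of appending each retrieved jar.
    CLASSPATH="$CLASSPATH:$FWDIR/lib_managed/jars/*"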
* Removed streaming-env.sh.template. (Tathagata Das, 2013-01-06; 1 file,
  +0/-4)
* Merge branch 'master' of github.com:mesos/spark into dev. Conflicts: run
  and, under core/src/main/scala/spark/: MapOutputTracker.scala,
  PairRDDFunctions.scala, ParallelCollection.scala, RDD.scala,
  rdd/BlockRDD.scala, rdd/CartesianRDD.scala, rdd/CoGroupedRDD.scala,
  rdd/CoalescedRDD.scala, rdd/FilteredRDD.scala, rdd/FlatMappedRDD.scala,
  rdd/GlommedRDD.scala, rdd/HadoopRDD.scala, rdd/MapPartitionsRDD.scala,
  rdd/MapPartitionsWithSplitRDD.scala, rdd/MappedRDD.scala,
  rdd/PipedRDD.scala, rdd/SampledRDD.scala, rdd/ShuffledRDD.scala,
  rdd/UnionRDD.scala, storage/BlockManager.scala,
  storage/BlockManagerId.scala, storage/BlockManagerMaster.scala,
  storage/StorageLevel.scala, util/MetadataCleaner.scala,
  util/TimeStampedHashMap.scala, plus
  core/src/test/scala/spark/storage/BlockManagerSuite.scala. (Reynold Xin,
  2012-12-20; 1 file, +10/-5)
  * Make "run" script work with Maven builds. (Matei Zaharia, 2012-12-10;
    1 file, +10/-5)
* Fixed bugs in RawNetworkInputDStream and in its examples. Made
  ReducedWindowedDStream persist RDDs to MEMORY_SER_ONLY by default.
  Removed unnecessary examples. Added streaming-env.sh.template with
  recommended settings for streaming. (Tathagata Das, 2012-11-12; 1 file,
  +4/-0)
* Merge remote-tracking branch 'public/master' into dev. Conflicts: run,
  project/SparkBuild.scala, and, under core/src/main/scala/spark/:
  BlockStoreShuffleFetcher.scala, KryoSerializer.scala,
  MapOutputTracker.scala, RDD.scala, SparkContext.scala,
  executor/Executor.scala, network/Connection.scala,
  network/ConnectionManagerTest.scala, rdd/BlockRDD.scala,
  rdd/NewHadoopRDD.scala, scheduler/ShuffleMapTask.scala,
  scheduler/cluster/StandaloneSchedulerBackend.scala,
  storage/BlockManager.scala, storage/BlockMessage.scala,
  storage/BlockStore.scala, storage/StorageLevel.scala,
  util/AkkaUtils.scala. (Matei Zaharia, 2012-10-24; 1 file, +34/-24)
  * Tweaked the run file to live more happily with Typesafe's Debian
    package. (Thomas Dudziak, 2012-10-22; 1 file, +30/-13)
  * Document how to configure SPARK_MEM & co. on a per-job basis. (Matei
    Zaharia, 2012-10-13; 1 file, +0/-7)
  * Made the run script add test-classes onto the classpath only if
    SPARK_TESTING is set; fixes #216. (root, 2012-10-07; 1 file, +4/-2)
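A sketch of the guard; the directory layout is an assumption based on the
sbt conventions of the time:

    # Only expose test classes when the caller explicitly opts in, so
    # ordinary runs don't pick up test fixtures.
    if [ -n "$SPARK_TESTING" ]; then
      CLASSPATH="$CLASSPATH:$CORE_DIR/target/scala-$SCALA_VERSION/test-classes"
    fi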
  * Don't check for JARs in core/lib anymore. (Matei Zaharia, 2012-10-04;
    1 file, +0/-3)
  * Update Scala version dependency to 2.9.2. (Matei Zaharia, 2012-09-24;
    1 file, +1/-1)
  * Added a unit test for local-cluster mode and simplified some of the
    code involved in that. (Matei Zaharia, 2012-09-07; 1 file, +1/-0)
* Added the Spark Streaming code, ported to Akka 2. (Matei Zaharia,
  2012-07-28; 1 file, +2/-0)
* Fixed SPARK_MEM not being passed when runner is java. (Matei Zaharia,
  2012-07-28; 1 file, +4/-1)
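When the runner is plain java rather than the scala wrapper, the heap size
has to be spelled out on the command line; a minimal sketch:

    # scala picks up JAVA_OPTS itself, but java needs explicit flags.
    JAVA_OPTS="$JAVA_OPTS -Xms$SPARK_MEM -Xmx$SPARK_MEM"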
* More work to allow Spark to run on the standalone deploy cluster.
  (Matei Zaharia, 2012-07-08; 1 file, +26/-7)
* More work on deploy code (adding Worker class). (Matei Zaharia,
  2012-06-30; 1 file, +1/-0)
* Further fixes to how Mesos is found and used. (Matei Zaharia,
  2012-03-17; 1 file, +23/-21)
* Set SCALA_VERSION to 2.9.1 (from 2.9.1.final) to match the expectation
  of SBT 0.11.0. (Ismael Juma, 2011-09-26; 1 file, +1/-1)
* Upgrade to Scala 2.9.1. Interestingly, the version in Maven is 2.9.1,
  but SBT outputs files to the 2.9.1.final directory inside target. A
  couple of small changes in SparkIMain were also required. All tests pass
  and ./spark-shell launches successfully. (Ismael Juma, 2011-08-31;
  1 file, +1/-1)
* Removed a debugging line. (Matei Zaharia, 2011-08-29; 1 file, +0/-1)
* Merge branch 'scala-2.9'. Conflicts: project/build/SparkProject.scala.
  (Matei Zaharia, 2011-08-01; 1 file, +9/-8)
  * Initial work on converting build to SBT 0.10.1. (Ismael Juma,
    2011-07-15; 1 file, +9/-8)
  * Pass quoted arguments properly to run. (Matei Zaharia, 2011-05-31;
    1 file, +1/-1)
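The classic fix behind a message like this is quoting the argument
expansion when handing arguments on to the JVM; a sketch (the exec line is
illustrative, not the script's actual text):

    # "$@" preserves each original argument, embedded spaces included;
    # unquoted $@ (or "$*") would re-split them on whitespace.
    exec "$RUNNER" "$@"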