path: root/bin
Commit log, most recent first. Each entry: commit message  [author, date, files changed, -lines deleted/+lines added]
* Merge remote-tracking branch 'apache-github/master' into remove-binaries  [Patrick Wendell, 2014-01-03, 24 files, -610/+712]
|\      Conflicts:
| |       core/src/test/scala/org/apache/spark/DriverSuite.scala
| |       docs/python-programming-guide.md
| * sbin/compute-classpath* bin/compute-classpath*  [Prashant Sharma, 2014-01-03, 4 files, -2/+146]
| |
| * sbin/spark-class* -> bin/spark-class*  [Prashant Sharma, 2014-01-03, 6 files, -4/+266]
| |
| * run-example -> bin/run-example  [Prashant Sharma, 2014-01-02, 2 files, -2/+2]
| |
| * Merge branch 'scripts-reorg' of github.com:shane-huang/incubator-spark into spark-915-segregate-scripts  [Prashant Sharma, 2014-01-02, 21 files, -752/+448]
|/|     Conflicts:
| |       bin/spark-shell
| |       core/pom.xml
| |       core/src/main/scala/org/apache/spark/SparkContext.scala
| |       core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
| |       core/src/main/scala/org/apache/spark/ui/UIWorkloadGenerator.scala
| |       core/src/test/scala/org/apache/spark/DriverSuite.scala
| |       python/run-tests
| |       sbin/compute-classpath.sh
| |       sbin/spark-class
| |       sbin/stop-slaves.sh
| * deprecate "spark" script and SPAKR_CLASSPATH environment variable  [Andrew xia, 2013-10-12, 1 file, -92/+0]
| |
| * refactor $FWD variable  [Andrew xia, 2013-09-29, 3 files, -4/+4]
| |
| * rm bin/spark.cmd as we don't have windows test environment. Will added it later if needed  [shane-huang, 2013-09-26, 1 file, -27/+0]
| |     Signed-off-by: shane-huang <shengsheng.huang@intel.com>
| * fix paths and change spark to use APP_MEM as application driver memory instead of SPARK_MEM, user should add application jars to SPARK_CLASSPATH  [shane-huang, 2013-09-26, 1 file, -33/+8]
| |     Signed-off-by: shane-huang <shengsheng.huang@intel.com>
| * add scripts in bin  [shane-huang, 2013-09-23, 8 files, -10/+155]
| |     Signed-off-by: shane-huang <shengsheng.huang@intel.com>
| * moved user scripts to bin folder  [shane-huang, 2013-09-23, 8 files, -0/+418]
| |     Signed-off-by: shane-huang <shengsheng.huang@intel.com>
| * add admin scripts to sbin  [shane-huang, 2013-09-23, 13 files, -704/+0]
| |     Signed-off-by: shane-huang <shengsheng.huang@intel.com>
| * added spark-class and spark-executor to sbin  [shane-huang, 2013-09-23, 1 file, -1/+1]
| |     Signed-off-by: shane-huang <shengsheng.huang@intel.com>
* | Merge branch 'master' into scala-2.10  [Raymond Liu, 2013-11-13, 5 files, -9/+57]
|\ \
| * \ Merge pull request #66 from shivaram/sbt-assembly-deps  [Matei Zaharia, 2013-10-18, 1 file, -4/+18]
| |\ \      Add SBT target to assemble dependencies
| | | |
| | | |     This pull request is an attempt to address the long assembly build
| | | |     times during development. Instead of rebuilding the assembly jar for
| | | |     every Spark change, this pull request adds a new SBT target `spark`
| | | |     that packages all the Spark modules and builds an assembly of the
| | | |     dependencies. So the work flow that should work now would be
| | | |     something like:
| | | |
| | | |         ./sbt/sbt spark    # Doing this once should suffice
| | | |         ## Make changes
| | | |         ./sbt/sbt compile
| | | |         ./sbt/sbt test or ./spark-shell
| | * | Exclude assembly jar from classpath if using deps  [Shivaram Venkataraman, 2013-10-16, 1 file, -10/+18]
| | | |
| | * | Merge branch 'master' of https://github.com/apache/incubator-spark into sbt-assembly-deps  [Shivaram Venkataraman, 2013-10-15, 1 file, -2/+0]
| | |\ \
| | * | | Add new SBT target for dependency assembly  [Shivaram Venkataraman, 2013-10-09, 1 file, -0/+6]
| | | |/
| | |/|
| * | | SPARK-627, Implementing --config arguments in the scripts  [KarthikTunga, 2013-10-16, 1 file, -1/+1]
| | | |
| * | | SPARK-627, Implementing --config arguments in the scripts  [KarthikTunga, 2013-10-16, 2 files, -2/+2]
| | | |
| * | | Implementing --config argument in the scripts  [KarthikTunga, 2013-10-16, 2 files, -7/+10]
| | | |
| * | | Merge branch 'master' of https://github.com/apache/incubator-spark  [KarthikTunga, 2013-10-15, 1 file, -2/+0]
| |\ \ \      Updating local branch
| | | |/
| | |/|
| | * | Address Matei's comments  [Aaron Davidson, 2013-10-05, 1 file, -2/+0]
| | | |
| | * | Standalone Scheduler fault recovery  [Aaron Davidson, 2013-09-26, 1 file, -1/+1]
| | |/      Implements a basic form of Standalone Scheduler fault recovery. In
| | |       particular, this allows faults to be manually recovered from by
| | |       means of restarting the Master process on the same machine. This is
| | |       the majority of the code necessary for general fault tolerance,
| | |       which will first elect a leader and then recover the Master state.
| | |       In order to enable fault recovery, the Master will persist a small
| | |       amount of state related to the registration of Workers and
| | |       Applications to disk. If the Master is started and sees that this
| | |       state is still around, it will enter Recovery mode, during which
| | |       time it will not schedule any new Executors on Workers (but it does
| | |       accept the registration of new Clients and Workers). At this point,
| | |       the Master attempts to reconnect to all Workers and Client
| | |       applications that were registered at the time of failure. After
| | |       confirming either the existence or nonexistence of all such nodes
| | |       (within a certain timeout), the Master will exit Recovery mode and
| | |       resume normal scheduling.
| * / SPARK-627 - reading --config argument  [KarthikTunga, 2013-10-15, 2 files, -0/+33]
| |/
* | version changed 2.9.3 -> 2.10 in shell script.  [Prashant Sharma, 2013-09-15, 1 file, -1/+1]
| |
* | Merged with master  [Prashant Sharma, 2013-09-06, 13 files, -121/+251]
|\|
| * Run script fixes for Windows after package & assembly change  [Matei Zaharia, 2013-09-01, 1 file, -26/+20]
| |
| * Initial work to rename package to org.apache.spark  [Matei Zaharia, 2013-09-01, 4 files, -5/+5]
| |
| * Print output from spark-daemon only when it fails to launch  [Matei Zaharia, 2013-08-31, 4 files, -8/+15]
| |
| * Delete some code that was added back in a merge and print less info in spark-daemon  [Matei Zaharia, 2013-08-31, 1 file, -3/+0]
| |
| * Fix finding of assembly JAR, as well as some pointers to ./run  [Matei Zaharia, 2013-08-29, 1 file, -1/+1]
| |
| * Change build and run instructions to use assemblies  [Matei Zaharia, 2013-08-29, 2 files, -72/+18]
| |     This commit makes Spark invocation saner by using an assembly JAR to
| |     find all of Spark's dependencies instead of adding all the JARs in
| |     lib_managed. It also packages the examples into an assembly and uses
| |     that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script
| |     with two better-named scripts: "run-examples" for examples, and
| |     "spark-class" for Spark internal classes (e.g. REPL, master, etc).
| |     This is also designed to minimize the confusion people have in trying
| |     to use "run" to run their own classes; it's not meant to do that, but
| |     now at least if they look at it, they can modify run-examples to do a
| |     decent job for them. As part of this, Bagel's examples are also now
| |     properly moved to the examples package instead of bagel.
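The assembly-based launch described in that commit can be sketched in shell. This is a hedged illustration of the general technique, not Spark's actual spark-class code; the `find_assembly` name and the directory layout are assumptions:

```shell
#!/usr/bin/env bash
# Instead of putting every jar from lib_managed on the classpath,
# locate the single assembly jar and use only that. The function name
# and layout are illustrative, not Spark's real script.

find_assembly() {
  local dir="$1"
  local jar
  # Take the first matching assembly jar, if any
  jar=$(ls "$dir"/spark-assembly-*.jar 2>/dev/null | head -n 1)
  if [ -z "$jar" ]; then
    echo "No assembly jar in $dir; build one first (e.g. sbt assembly)" >&2
    return 1
  fi
  printf '%s\n' "$jar"
}

# A launcher would then run a class against that single jar, e.g.:
#   exec java -cp "$(find_assembly "$SPARK_HOME/assembly/target")" "$@"
```

The payoff is that adding or upgrading a dependency never changes the launcher: the classpath is always exactly one jar.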
| * Don't assume spark-examples JAR always exists  [Jey Kottalam, 2013-08-18, 1 file, -2/+3]
| |
| * Maven build now also works with YARN  [Jey Kottalam, 2013-08-16, 1 file, -1/+1]
| |
| * yarn support  [Jey Kottalam, 2013-08-15, 2 files, -0/+4]
| |
| * Log the launch command for Spark daemons  [Patrick Wendell, 2013-08-02, 1 file, -1/+4]
| |     For debugging and analysis purposes, it's nice to have the exact
| |     command used to launch Spark contained within the logs. This adds the
| |     necessary hooks to make that possible.
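The idea in that commit, in minimal form: write the exact launch command to the daemon's log before executing it, so a failed launch can be diagnosed from the log alone. `run_logged` below is an illustrative helper, not the real spark-daemon.sh:

```shell
#!/usr/bin/env bash
# Record the launch command itself in the log, then append the
# command's own output. Helper name and log format are hypothetical.

run_logged() {
  local log="$1"; shift
  echo "Spark command: $*" >> "$log"   # the exact command, for post-mortems
  "$@" >> "$log" 2>&1                  # then everything it prints
}
```

Usage would look like `run_logged /var/log/spark-master.log java -cp ... org.apache.spark.deploy.master.Master` (paths illustrative).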
| * Fix setting of SPARK_EXAMPLES_JAR  [Jey Kottalam, 2013-07-24, 1 file, -11/+0]
| |
| * Add JavaAPICompletenessChecker.  [Josh Rosen, 2013-07-22, 2 files, -0/+4]
| |     This is used to find methods in the Scala API that need to be ported
| |     to the Java API. To use it:
| |
| |         ./run spark.tools.JavaAPICompletenessChecker
| |
| |     Conflicts:
| |       project/SparkBuild.scala
| |       run
| |       run2.cmd
| * Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817)  [Ubuntu, 2013-07-18, 1 file, -1/+1]
| |
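The portability fix above in one line: bash is not guaranteed to live at /bin/bash on every system (BSDs and some Linux distributions install it elsewhere), but `/usr/bin/env` resolves it via PATH, so scripts begin with `#!/usr/bin/env bash`. A quick sketch demonstrating that env finds a working bash:

```shell
#!/usr/bin/env bash
# /usr/bin/env searches PATH for "bash", so this works even where
# /bin/bash does not exist. Here we ask the bash that env resolves
# to report its own version string.
resolved_version=$(/usr/bin/env bash -c 'echo "$BASH_VERSION"')
echo "env found bash version: $resolved_version"
```
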
| * Some missing license headers  [Matei Zaharia, 2013-07-16, 2 files, -0/+34]
| |
| * Add Apache license headers and LICENSE and NOTICE files  [Matei Zaharia, 2013-07-16, 11 files, -7/+158]
| |
* | Merge branch 'master' of github.com:mesos/spark into scala-2.10  [Prashant Sharma, 2013-07-15, 2 files, -49/+65]
|\|     Conflicts:
| |       core/src/main/scala/spark/Utils.scala
| |       core/src/test/scala/spark/ui/UISuite.scala
| |       project/SparkBuild.scala
| |       run
| * Merge remote-tracking branch 'origin/pr/662'  [Matei Zaharia, 2013-07-13, 2 files, -49/+65]
| |\      Conflicts:
| | |       bin/compute-classpath.sh
| | * Merge branch 'master' into 2013-06/assembly-jar-deploy  [Evan Chan, 2013-06-28, 2 files, -0/+154]
| | |\      Conflicts:
| | | |       run
| | | |
| | | |     Previous changes that I made to run and set-dev-classpath.sh
| | | |     instead have been folded into compute-classpath.sh
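The consolidation that merge describes can be sketched as one `compute_classpath` function every launcher sources, instead of each run script duplicating its own jar list. This is a hedged sketch; the function name and the conf/lib layout below are illustrative, not Spark's exact compute-classpath.sh:

```shell
#!/usr/bin/env bash
# Build a single colon-separated classpath from a framework directory.
# Layout assumed for illustration: $fwdir/conf for configs, $fwdir/lib
# for jars. Putting conf/ first lets config files shadow jar contents.

compute_classpath() {
  local fwdir="$1"
  local cp="$fwdir/conf"
  local jar
  for jar in "$fwdir"/lib/*.jar; do
    [ -e "$jar" ] || continue     # skip if the glob matched nothing
    cp="$cp:$jar"
  done
  printf '%s\n' "$cp"
}
```

Each launcher then does `CLASSPATH=$(compute_classpath "$FWDIR")`, so classpath changes happen in exactly one place.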
| | * | Add simple usage to start-slave script  [Evan Chan, 2013-06-24, 1 file, -0/+3]
| | | |
* | | | Merge branch 'master' into master-merge  [Prashant Sharma, 2013-07-12, 2 files, -0/+4]
|\| | |     Conflicts:
| | | |       README.md
| | | |       core/pom.xml
| | | |       core/src/main/scala/spark/deploy/JsonProtocol.scala
| | | |       core/src/main/scala/spark/deploy/LocalSparkCluster.scala
| | | |       core/src/main/scala/spark/deploy/master/Master.scala
| | | |       core/src/main/scala/spark/deploy/master/MasterWebUI.scala
| | | |       core/src/main/scala/spark/deploy/worker/Worker.scala
| | | |       core/src/main/scala/spark/deploy/worker/WorkerWebUI.scala
| | | |       core/src/main/scala/spark/storage/BlockManagerUI.scala
| | | |       core/src/main/scala/spark/util/AkkaUtils.scala
| | | |       pom.xml
| | | |       project/SparkBuild.scala
| | | |       streaming/src/main/scala/spark/streaming/receivers/ActorReceiver.scala
| * | | Renamed ML package to MLlib and added it to classpath  [Matei Zaharia, 2013-07-05, 2 files, -0/+4]
| | |/
| |/|
* / | Removed some unnecessary code and fixed dependencies  [Prashant Sharma, 2013-07-11, 1 file, -1/+1]
|/ /
* | Fixes to compute-classpath on Windows  [Matei Zaharia, 2013-06-26, 1 file, -2/+2]
| |