path: root/bin
Commit message | Author | Age | Files | Lines
* Merge branch 'master' into scala-2.10 | Raymond Liu | 2013-11-13 | 5 | -9/+57
|\
| * Merge pull request #66 from shivaram/sbt-assembly-deps | Matei Zaharia | 2013-10-18 | 1 | -4/+18
| |\
| | |     Add SBT target to assemble dependencies
| | |
| | |     This pull request is an attempt to address the long assembly build
| | |     times during development. Instead of rebuilding the assembly jar for
| | |     every Spark change, this pull request adds a new SBT target `spark`
| | |     that packages all the Spark modules and builds an assembly of the
| | |     dependencies. So the workflow that should work now would be something
| | |     like:
| | |
| | |     ```
| | |     ./sbt/sbt spark      # Doing this once should suffice
| | |     ## Make changes
| | |     ./sbt/sbt compile
| | |     ./sbt/sbt test or ./spark-shell
| | |     ```
| | * Exclude assembly jar from classpath if using deps | Shivaram Venkataraman | 2013-10-16 | 1 | -10/+18
| | |
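The "exclude assembly jar if using deps" idea above can be sketched roughly as follows. This is a hypothetical illustration, not Spark's actual `compute-classpath.sh`: the `build_classpath` function, directory layout, and `*-deps.jar` naming are all assumptions made for the example.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: when a dependencies-only assembly is present, put it
# and the locally compiled classes on the classpath, and leave the full
# assembly jar off so "sbt compile" output takes effect without reassembly.
# Names and paths are illustrative, not Spark's real ones.

build_classpath() {
  local assembly_dir="$1"   # directory that may contain assembly jars
  local classes_dir="$2"    # locally compiled classes
  local cp=""
  if ls "$assembly_dir"/*-deps.jar >/dev/null 2>&1; then
    # deps-only assembly present: compiled classes + deps jar only
    cp="$classes_dir"
    for jar in "$assembly_dir"/*-deps.jar; do cp="$cp:$jar"; done
  else
    # otherwise fall back to every jar in the assembly directory
    for jar in "$assembly_dir"/*.jar; do cp="$cp${cp:+:}$jar"; done
  fi
  echo "$cp"
}
```

A caller would capture the result with `CLASSPATH=$(build_classpath assembly/target classes)`.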
| | * Merge branch 'master' of https://github.com/apache/incubator-spark into sbt-assembly-deps | Shivaram Venkataraman | 2013-10-15 | 1 | -2/+0
| | |\
| | * | Add new SBT target for dependency assembly | Shivaram Venkataraman | 2013-10-09 | 1 | -0/+6
| | | |
| * | | SPARK-627, Implementing --config arguments in the scripts | KarthikTunga | 2013-10-16 | 1 | -1/+1
| | | |
| * | | SPARK-627, Implementing --config arguments in the scripts | KarthikTunga | 2013-10-16 | 2 | -2/+2
| | | |
| * | | Implementing --config argument in the scripts | KarthikTunga | 2013-10-16 | 2 | -7/+10
| | | |
| * | | Merge branch 'master' of https://github.com/apache/incubator-spark | KarthikTunga | 2013-10-15 | 1 | -2/+0
| |\ \ \
| | | |/
| | |/|
| | | |     Updating local branch
| | * | Address Matei's comments | Aaron Davidson | 2013-10-05 | 1 | -2/+0
| | | |
| | * | Standalone Scheduler fault recovery | Aaron Davidson | 2013-09-26 | 1 | -1/+1
| | |/
| | |
| | |     Implements a basic form of Standalone Scheduler fault recovery. In
| | |     particular, this allows faults to be manually recovered from by means
| | |     of restarting the Master process on the same machine. This is the
| | |     majority of the code necessary for general fault tolerance, which will
| | |     first elect a leader and then recover the Master state.
| | |
| | |     In order to enable fault recovery, the Master will persist a small
| | |     amount of state related to the registration of Workers and
| | |     Applications to disk. If the Master is started and sees that this
| | |     state is still around, it will enter Recovery mode, during which time
| | |     it will not schedule any new Executors on Workers (but it does accept
| | |     the registration of new Clients and Workers).
| | |
| | |     At this point, the Master attempts to reconnect to all Workers and
| | |     Client applications that were registered at the time of failure. After
| | |     confirming either the existence or nonexistence of all such nodes
| | |     (within a certain timeout), the Master will exit Recovery mode and
| | |     resume normal scheduling.
| * / SPARK-627 - reading --config argument | KarthikTunga | 2013-10-15 | 2 | -0/+33
| |/
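The SPARK-627 commits above add a `--config <dir>` argument to the launch scripts. A minimal sketch of the pattern, with a hypothetical `parse_config_arg` helper and default directory (the real scripts' variable names may differ):

```shell
#!/usr/bin/env bash
# Illustrative sketch, not the actual Spark scripts: accept an optional
# leading "--config <dir>" argument, use it as the configuration directory,
# and pass the remaining arguments through to the underlying command.

parse_config_arg() {
  CONF_DIR="conf"                  # assumed default configuration directory
  if [ "$1" = "--config" ]; then
    if [ -z "$2" ]; then
      echo "Error: --config requires a directory argument" >&2
      return 1
    fi
    CONF_DIR="$2"
    shift 2
  fi
  REMAINING_ARGS="$*"              # whatever is left for the real command
}
```

For example, `parse_config_arg --config /etc/spark start` leaves `CONF_DIR=/etc/spark` and `REMAINING_ARGS=start`.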
* | version changed 2.9.3 -> 2.10 in shell script | Prashant Sharma | 2013-09-15 | 1 | -1/+1
| |
* | Merged with master | Prashant Sharma | 2013-09-06 | 13 | -121/+251
|\|
| * Run script fixes for Windows after package & assembly change | Matei Zaharia | 2013-09-01 | 1 | -26/+20
| |
| * Initial work to rename package to org.apache.spark | Matei Zaharia | 2013-09-01 | 4 | -5/+5
| |
| * Print output from spark-daemon only when it fails to launch | Matei Zaharia | 2013-08-31 | 4 | -8/+15
| |
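The "print output only when it fails to launch" change above follows a common daemon-script pattern: capture output to a log and surface it only on a non-zero exit. A hedged sketch (the `run_quietly` helper is invented for illustration):

```shell
#!/usr/bin/env bash
# Sketch of quiet-on-success launching: the command's output goes to a log
# file; it is echoed back to the user only if the command fails, so healthy
# daemon starts stay silent. Helper name and behavior are illustrative.

run_quietly() {
  local logfile="$1"; shift
  if "$@" >"$logfile" 2>&1; then
    return 0
  else
    local status=$?               # exit status of the failed command
    echo "failed to launch: $*" >&2
    cat "$logfile" >&2            # show the captured output on failure only
    return $status
  fi
}
```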
| * Delete some code that was added back in a merge and print less info in spark-daemon | Matei Zaharia | 2013-08-31 | 1 | -3/+0
| |
| * Fix finding of assembly JAR, as well as some pointers to ./run | Matei Zaharia | 2013-08-29 | 1 | -1/+1
| |
| * Change build and run instructions to use assemblies | Matei Zaharia | 2013-08-29 | 2 | -72/+18
| |
| |     This commit makes Spark invocation saner by using an assembly JAR to
| |     find all of Spark's dependencies instead of adding all the JARs in
| |     lib_managed. It also packages the examples into an assembly and uses
| |     that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script
| |     with two better-named scripts: "run-examples" for examples, and
| |     "spark-class" for Spark internal classes (e.g. REPL, master, etc). This
| |     is also designed to minimize the confusion people have in trying to use
| |     "run" to run their own classes; it's not meant to do that, but now at
| |     least if they look at it, they can modify run-examples to do a decent
| |     job for them.
| |
| |     As part of this, Bagel's examples are also now properly moved to the
| |     examples package instead of bagel.
| * Don't assume spark-examples JAR always exists | Jey Kottalam | 2013-08-18 | 1 | -2/+3
| |
| * Maven build now also works with YARN | Jey Kottalam | 2013-08-16 | 1 | -1/+1
| |
| * yarn support | Jey Kottalam | 2013-08-15 | 2 | -0/+4
| |
| * Log the launch command for Spark daemons | Patrick Wendell | 2013-08-02 | 1 | -1/+4
| |
| |     For debugging and analysis purposes, it's nice to have the exact
| |     command used to launch Spark contained within the logs. This adds the
| |     necessary hooks to make that possible.
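The launch-command logging described above amounts to echoing the exact command line into the daemon log before running it. A minimal sketch, with an invented `launch_with_logging` helper (the real hook lives in spark-daemon.sh):

```shell
#!/usr/bin/env bash
# Sketch of recording the exact launch command for later debugging: the
# command line is appended to the log before the command's own output.
# Helper name and log layout are illustrative.

launch_with_logging() {
  local logfile="$1"; shift
  echo "Spark command: $*" >>"$logfile"   # record exactly what we run
  "$@" >>"$logfile" 2>&1
}
```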
| * Fix setting of SPARK_EXAMPLES_JAR | Jey Kottalam | 2013-07-24 | 1 | -11/+0
| |
| * Add JavaAPICompletenessChecker. | Josh Rosen | 2013-07-22 | 2 | -0/+4
| |
| |     This is used to find methods in the Scala API that need to be ported to
| |     the Java API. To use it:
| |
| |     ./run spark.tools.JavaAPICompletenessChecker
| |
| |     Conflicts:
| |       project/SparkBuild.scala
| |       run
| |       run2.cmd
| * Consistently invoke bash with /usr/bin/env bash in scripts to make code more portable (JIRA Ticket SPARK-817) | Ubuntu | 2013-07-18 | 1 | -1/+1
| |
| * Some missing license headers | Matei Zaharia | 2013-07-16 | 2 | -0/+34
| |
| * Add Apache license headers and LICENSE and NOTICE files | Matei Zaharia | 2013-07-16 | 11 | -7/+158
| |
* | Merge branch 'master' of github.com:mesos/spark into scala-2.10 | Prashant Sharma | 2013-07-15 | 2 | -49/+65
|\|
| |
| |     Conflicts:
| |       core/src/main/scala/spark/Utils.scala
| |       core/src/test/scala/spark/ui/UISuite.scala
| |       project/SparkBuild.scala
| |       run
| * Merge remote-tracking branch 'origin/pr/662' | Matei Zaharia | 2013-07-13 | 2 | -49/+65
| |\
| | |
| | |     Conflicts:
| | |       bin/compute-classpath.sh
| | * Merge branch 'master' into 2013-06/assembly-jar-deploy | Evan Chan | 2013-06-28 | 2 | -0/+154
| | |\
| | | |
| | | |     Conflicts:
| | | |       run
| | | |
| | | |     Previous changes that I made to run and set-dev-classpath.sh instead
| | | |     have been folded into compute-classpath.sh
| | * | Add simple usage to start-slave script | Evan Chan | 2013-06-24 | 1 | -0/+3
| | | |
* | | | Merge branch 'master' into master-merge | Prashant Sharma | 2013-07-12 | 2 | -0/+4
|\| | |
| | | |
| | | |     Conflicts:
| | | |       README.md
| | | |       core/pom.xml
| | | |       core/src/main/scala/spark/deploy/JsonProtocol.scala
| | | |       core/src/main/scala/spark/deploy/LocalSparkCluster.scala
| | | |       core/src/main/scala/spark/deploy/master/Master.scala
| | | |       core/src/main/scala/spark/deploy/master/MasterWebUI.scala
| | | |       core/src/main/scala/spark/deploy/worker/Worker.scala
| | | |       core/src/main/scala/spark/deploy/worker/WorkerWebUI.scala
| | | |       core/src/main/scala/spark/storage/BlockManagerUI.scala
| | | |       core/src/main/scala/spark/util/AkkaUtils.scala
| | | |       pom.xml
| | | |       project/SparkBuild.scala
| | | |       streaming/src/main/scala/spark/streaming/receivers/ActorReceiver.scala
| * | | Renamed ML package to MLlib and added it to classpath | Matei Zaharia | 2013-07-05 | 2 | -0/+4
| | |/
| |/|
* / | Removed some unnecessary code and fixed dependencies | Prashant Sharma | 2013-07-11 | 1 | -1/+1
|/ /
* | Fixes to compute-classpath on Windows | Matei Zaharia | 2013-06-26 | 1 | -2/+2
| |
* | Fix computation of classpath when we launch java directly | Matei Zaharia | 2013-06-25 | 2 | -0/+141
| |
| |     The previous version assumed that a CLASSPATH environment variable was
| |     set by the "run" script when launching the process that starts the
| |     ExecutorRunner, but unfortunately this is not true in tests. Instead,
| |     we factor the classpath calculation into an external script and call
| |     that.
| |
| |     NOTE: This includes a Windows version but hasn't yet been tested there.
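Factoring the classpath calculation into an external script, as the commit above describes, lets both shell scripts and directly launched Java processes obtain the same classpath via command substitution. A sketch under assumed names (the real script is bin/compute-classpath.sh, whose contents differ):

```shell
#!/usr/bin/env bash
# Illustrative sketch of a standalone classpath calculator: gather every jar
# under a directory into one colon-separated string and print it, so any
# caller can do CP=$(compute_classpath <dir>). Names are assumptions.

compute_classpath() {
  local dir="$1"
  local cp=""
  for jar in "$dir"/*.jar; do
    [ -e "$jar" ] || continue     # no jars at all: the glob did not expand
    cp="$cp${cp:+:}$jar"
  done
  echo "$cp"
}
```

A caller might then run `java -cp "$(compute_classpath lib_managed)" SomeClass`.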
* | Revert "Fix start-slave not passing instance number to spark-daemon." | Matei Zaharia | 2013-06-11 | 1 | -1/+1
| |
| |     This reverts commit a674d67c0aebb940e3b816e2307206115baec175.
* | Fix start-slave not passing instance number to spark-daemon. | Stephen Haberman | 2013-05-28 | 1 | -1/+1
|/
* Use ec2-metadata in start-slave.sh. | Josh Rosen | 2013-05-24 | 1 | -1/+2
|
|     PR #419 applied the same change, but only to start-master.sh, so some
|     workers were still starting their web UIs using internal addresses. This
|     should finally fix SPARK-613.
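The ec2-metadata approach above queries the instance's public hostname via the `ec2-metadata` tool shipped on Amazon Linux AMIs. A hedged sketch with a fallback for non-EC2 hosts; the `--public-hostname` flag and its `key: value` output format are ec2-metadata's documented behavior, but the wrapper function is invented here:

```shell
#!/usr/bin/env bash
# Sketch: prefer the EC2 public hostname when the ec2-metadata tool exists
# (its output looks like "public-hostname: ec2-....amazonaws.com"), and fall
# back to the local hostname elsewhere. Wrapper name is illustrative.

public_hostname() {
  if command -v ec2-metadata >/dev/null 2>&1; then
    ec2-metadata --public-hostname | cut -d " " -f 2
  else
    hostname
  fi
}
```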
* spark instance number must be present in log filename to prevent multiple workers from overwriting each other's logs | kalpit | 2013-03-26 | 1 | -2/+2
|
* added SPARK_WORKER_INSTANCES: allows spawning multiple worker instances/processes on every slave machine | kalpit | 2013-03-26 | 7 | -8/+29
|
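The SPARK_WORKER_INSTANCES variable above drives a simple loop in the start scripts: one daemon launch per instance number, so each instance can get distinct ports and log filenames. A sketch with the daemon launch simulated by echo (the real scripts call spark-daemon.sh with the instance number):

```shell
#!/usr/bin/env bash
# Sketch of spawning several worker instances per machine, keyed on a
# SPARK_WORKER_INSTANCES-style variable that defaults to 1. The echo stands
# in for the real daemon launch; the loop shape is the point.

start_workers() {
  local instances="${SPARK_WORKER_INSTANCES:-1}"
  local i=1
  while [ "$i" -le "$instances" ]; do
    echo "starting worker instance $i"   # real scripts exec the daemon here
    i=$((i + 1))
  done
}
```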
* Detect whether we run on EC2 using ec2-metadata as well | Shivaram Venkataraman | 2013-01-26 | 1 | -1/+2
|
* Use spark-env.sh to configure standalone master. See SPARK-638. | Josh Rosen | 2012-12-14 | 3 | -5/+19
|
|     Also fixed a typo in the standalone mode documentation.
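The spark-env.sh pattern referenced above (SPARK-638) means sourcing an optional per-deployment environment file before starting a daemon, so settings live outside the scripts. A sketch with an invented `load_spark_env` helper and an assumed `conf/` default:

```shell
#!/usr/bin/env bash
# Sketch: source conf/spark-env.sh if it exists, so variables such as
# SPARK_MASTER_PORT can be set per deployment without editing the scripts.
# Helper name and default path are illustrative.

load_spark_env() {
  local conf_dir="${1:-conf}"
  if [ -f "$conf_dir/spark-env.sh" ]; then
    . "$conf_dir/spark-env.sh"    # pull its variables into this shell
  fi
}
```

A missing file is simply skipped, so the scripts work unchanged on hosts with no custom configuration.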
* Use external addresses in standalone WebUI on EC2. | Josh Rosen | 2012-12-01 | 3 | -3/+28
|
* Use hostname instead of IP in deploy scripts to let Akka connect properly | Matei Zaharia | 2012-11-27 | 1 | -13/+2
|
* Use SPARK_MASTER_IP if it is set in start-slaves.sh. | Reynold Xin | 2012-10-19 | 1 | -2/+16
|
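The SPARK_MASTER_IP change above is a classic environment-variable fallback: use the variable when the operator sets it, otherwise default to the local hostname. A minimal sketch (the `master_address` helper is invented for illustration):

```shell
#!/usr/bin/env bash
# Sketch of the start-slaves.sh-style fallback: honor SPARK_MASTER_IP when
# set, else assume the master runs on this host. Helper name is illustrative.

master_address() {
  if [ -n "$SPARK_MASTER_IP" ]; then
    echo "$SPARK_MASTER_IP"
  else
    hostname
  fi
}
```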
* Update license info on deploy scripts | Matei Zaharia | 2012-09-25 | 2 | -0/+6
|
* Add Apache license to non-trivial scripts taken from Hadoop. | Denny | 2012-08-04 | 2 | -0/+30
|