Better account for various side-effect outputs while executing
"mvn help:evaluate -Dexpression=project.version"

Author: Rahul Singhal <rahul.singhal@guavus.com>

Closes #572 from rahulsinghaliitd/SPARK-1650 and squashes the following commits:

fd6a611 [Rahul Singhal] SPARK-1650: Correctly identify maven project version
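
The technique here, sketched minimally in shell (variable names and exact filters are assumptions, not the commit's actual code): strip Maven's bracketed log lines and download-progress noise so that only the evaluated value remains.

    # Sketch: keep only the evaluated expression; drop "[INFO]"-style log
    # lines and any download-progress output Maven mixes into stdout.
    VERSION=$(mvn help:evaluate -Dexpression=project.version 2>/dev/null \
      | grep -v '^\[' \
      | grep -v 'Download' \
      | tail -n 1)
    echo "Project version: $VERSION"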
This simplifies the shell a bunch and passes all arguments through to
spark-submit. There is a tiny incompatibility with 0.9.1: you can no
longer pass `-c`, only `--cores`. However, spark-submit gives a good
error message in this case, few people are likely to have used `-c`,
and it is a trivial change for users.

Author: Patrick Wendell <pwendell@gmail.com>

Closes #542 from pwendell/spark-shell and squashes the following commits:

9eb3e6f [Patrick Wendell] Updating Spark docs
b552459 [Patrick Wendell] Andrew's feedback
97720fa [Patrick Wendell] Review feedback
aa2900b [Patrick Wendell] SPARK-1619 Launch spark-shell with spark-submit
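
A minimal sketch of the pass-through wrapper described above, assuming a standard layout; this is illustrative, not the exact script:

    #!/usr/bin/env bash
    # spark-shell as a thin wrapper: forward every CLI argument to
    # spark-submit, with the Scala REPL as the main class.
    SPARK_HOME="$(cd "$(dirname "$0")/.." && pwd)"
    exec "$SPARK_HOME/bin/spark-submit" --class org.apache.spark.repl.Main "$@"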
1. Makes assembly and examples jar naming consistent in maven/sbt.
2. Updates make-distribution.sh to use Maven and fixes some bugs.
3. Updates the create-release script to call the make-distribution script.

Author: Patrick Wendell <pwendell@gmail.com>

Closes #502 from pwendell/make-distribution and squashes the following commits:

1a97f0d [Patrick Wendell] SPARK-1119 and other build improvements
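
A hypothetical usage sketch; the --tgz flag is an assumption about the script's interface at the time:

    # Drive the whole Maven build and produce a runnable distribution
    # tarball in one step.
    ./make-distribution.sh --tgz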
Author: Nick Lanham <nick@afternight.org>

Closes #264 from nicklan/make-distribution-fixes and squashes the following commits:

172b981 [Nick Lanham] fix path for jar, make sed actually work on OSX
I don't have access to an OSX machine, so if someone could test this, that would be great.

Author: Nick Lanham <nick@afternight.org>

Closes #258 from nicklan/osx-sed-fix and squashes the following commits:

a6f158f [Nick Lanham] Also make mktemp work on OSX
558fd6e [Nick Lanham] Make sed do -i '' on OSX
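
For context, a sketch of the portability gap these commits close: BSD sed on OS X needs an explicit (possibly empty) backup suffix after -i, and GNU mktemp rejects templates without X's. One common pattern, not necessarily the script's exact code:

    # In-place sed: GNU sed accepts -i alone; BSD (OS X) sed needs -i ''.
    if [ "$(uname)" = "Darwin" ]; then
      sed -i '' 's/old/new/' conf/example.conf   # hypothetical target file
    else
      sed -i 's/old/new/' conf/example.conf
    fi

    # mktemp: a template with X's plus -t works on both GNU and BSD variants.
    TMP_DIR=$(mktemp -d -t spark.XXXXXX)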
This should all work as expected with the current version of the tachyon tarball (0.4.1).

Author: Nick Lanham <nick@afternight.org>

Closes #137 from nicklan/bundle-tachyon and squashes the following commits:

2eee15b [Nick Lanham] Put back in exec, start tachyon first
738ba23 [Nick Lanham] Move tachyon out of sbin
f2f9bc6 [Nick Lanham] More checks for tachyon script
111e8e1 [Nick Lanham] Only try tachyon operations if tachyon script exists
0561574 [Nick Lanham] Copy over web resources so web interface can run
4dc9809 [Nick Lanham] Update to tachyon 0.4.1
0a1a20c [Nick Lanham] Add scripts using tachyon tarball
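
A sketch of the "only try tachyon operations if tachyon script exists" guard from the squash list; the path and subcommand shown are hypothetical:

    TACHYON_SCRIPT="$SPARK_HOME/tachyon/bin/tachyon"   # hypothetical location
    # Skip all Tachyon steps unless the bundled script actually shipped.
    if [ -x "$TACHYON_SCRIPT" ]; then
      "$TACHYON_SCRIPT" format   # hypothetical subcommand
    fi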
Conflicts:
    core/src/test/scala/org/apache/spark/DriverSuite.scala
    docs/python-programming-guide.md
spark-915-segregate-scripts

Conflicts:
    bin/spark-shell
    core/pom.xml
    core/src/main/scala/org/apache/spark/SparkContext.scala
    core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
    core/src/main/scala/org/apache/spark/ui/UIWorkloadGenerator.scala
    core/src/test/scala/org/apache/spark/DriverSuite.scala
    python/run-tests
    sbin/compute-classpath.sh
    sbin/spark-class
    sbin/stop-slaves.sh
Signed-off-by: shane-huang <shengsheng.huang@intel.com>
Closes #316
This commit makes Spark invocation saner by using an assembly JAR to
find all of Spark's dependencies instead of adding all the JARs in
lib_managed. It also packages the examples into an assembly and uses
that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script
with two better-named scripts: "run-examples" for examples, and
"spark-class" for Spark internal classes (e.g. REPL, master, etc.).
This is also designed to minimize the confusion people have in trying
to use "run" to run their own classes; it's not meant to do that, but
now at least if they look at it, they can modify run-examples to do a
decent job for them.

As part of this, Bagel's examples are also now properly moved to the
examples package instead of bagel.
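
A minimal sketch of the assembly-based classpath idea; the glob and directory layout are assumptions about the build output, not the script's exact contents:

    # Resolve the single assembly JAR produced by the build and put it on
    # the classpath, instead of enumerating every JAR in lib_managed/.
    ASSEMBLY_JAR=$(ls "$SPARK_HOME"/assembly/target/scala-*/spark-assembly-*.jar 2>/dev/null | head -n 1)
    CLASSPATH="$SPARK_HOME/conf:$ASSEMBLY_JAR"
    exec java -cp "$CLASSPATH" "$@"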
requiring Spark to be installed. Using 'make-distribution.sh' a user
can put a Spark distribution at a URI supported by Mesos (e.g.,
'hdfs://...') and then set that when launching their job. Also added
SPARK_EXECUTOR_URI for the REPL.
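
A hypothetical end-to-end sketch of this flow; the tarball name and HDFS path are placeholders:

    # Build a distribution, publish it where Mesos slaves can fetch it,
    # then point executors at that URI instead of a local install.
    ./make-distribution.sh
    hadoop fs -put spark-dist.tgz hdfs:///deploy/spark-dist.tgz   # placeholder
    export SPARK_EXECUTOR_URI=hdfs:///deploy/spark-dist.tgz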
Conflicts:
    make-distribution.sh
Conflicts:
    bin/compute-classpath.sh