author    | Patrick Wendell <pwendell@gmail.com> | 2014-04-24 23:59:16 -0700
committer | Patrick Wendell <pwendell@gmail.com> | 2014-04-24 23:59:16 -0700
commit    | dc3b640a0ab3501b678b591be3e99fbcf3badbec (patch)
tree      | 2865c2a3cef66f061d846f6a968725e83728271b /docs/spark-standalone.md
parent    | 6e101f1183f92769779bc8ac14813c063bf1ff3f (diff)
SPARK-1619 Launch spark-shell with spark-submit
This simplifies the shell considerably and passes all arguments through to spark-submit.
There is a tiny incompatibility with 0.9.1: you can no longer pass `-c` as an alternative to `--cores`; only `--cores` is accepted. However, spark-submit gives a clear error message in this case, few people are likely to have used `-c`, and the change is trivial for users.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #542 from pwendell/spark-shell and squashes the following commits:
9eb3e6f [Patrick Wendell] Updating Spark docs
b552459 [Patrick Wendell] Andrew's feedback
97720fa [Patrick Wendell] Review feedback
aa2900b [Patrick Wendell] SPARK-1619 Launch spark-shell with spark-submit
Diffstat (limited to 'docs/spark-standalone.md')
-rw-r--r-- | docs/spark-standalone.md | 4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 7e4eea323a..dc7f206e03 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -139,12 +139,12 @@ constructor](scala-programming-guide.html#initializing-spark).

 To run an interactive Spark shell against the cluster, run the following command:

-    MASTER=spark://IP:PORT ./bin/spark-shell
+    ./bin/spark-shell --master spark://IP:PORT

 Note that if you are running spark-shell from one of the spark cluster machines, the `bin/spark-shell` script will automatically set MASTER from the `SPARK_MASTER_IP` and `SPARK_MASTER_PORT` variables in `conf/spark-env.sh`.

-You can also pass an option `-c <numCores>` to control the number of cores that spark-shell uses on the cluster.
+You can also pass an option `--cores <numCores>` to control the number of cores that spark-shell uses on the cluster.

 # Launching Compiled Spark Applications
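The before-and-after invocation from the diff above can be sketched as a shell snippet. This is an illustrative sketch: `spark://IP:PORT` is the placeholder used in the docs, and the `--cores 4` value is an arbitrary example, not something the commit itself specifies.

```shell
# Placeholder master URL from the docs; substitute your cluster's
# actual host and port (e.g. spark://master-host:7077).
MASTER_URL="spark://IP:PORT"

# Old (0.9.1) style: the master was passed via an environment variable,
# and core count via -c:
#   MASTER="$MASTER_URL" ./bin/spark-shell -c 4

# New style after this change: everything is a spark-submit flag,
# and only the long form --cores is accepted:
#   ./bin/spark-shell --master "$MASTER_URL" --cores 4

echo "$MASTER_URL"
```

On a machine that is itself part of the standalone cluster, `bin/spark-shell` will still derive the master URL automatically from `SPARK_MASTER_IP` and `SPARK_MASTER_PORT` in `conf/spark-env.sh`, so `--master` can be omitted there.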