path: root/docs/spark-standalone.md
author: shane-huang <shengsheng.huang@intel.com> 2013-09-23 12:42:34 +0800
committer: shane-huang <shengsheng.huang@intel.com> 2013-09-23 12:42:34 +0800
commit: fcfe4f920484b64b01e4e22219d59c78ffd17054 (patch)
tree: e7e489706f787016fc1278e4babd27b01df3dc7f /docs/spark-standalone.md
parent: dfbdc9ddb773e2b1149e6a6c661f14b631b692d0 (diff)
add admin scripts to sbin
Signed-off-by: shane-huang <shengsheng.huang@intel.com>
Diffstat (limited to 'docs/spark-standalone.md')
-rw-r--r--  docs/spark-standalone.md  12
1 file changed, 6 insertions(+), 6 deletions(-)
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 9d4ad1ec8d..b3f9160673 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -67,12 +67,12 @@ To launch a Spark standalone cluster with the launch scripts, you need to create
Once you've set up this file, you can launch or stop your cluster with the following shell scripts, based on Hadoop's deploy scripts, and available in `SPARK_HOME/bin`:
-- `bin/start-master.sh` - Starts a master instance on the machine the script is executed on.
-- `bin/start-slaves.sh` - Starts a slave instance on each machine specified in the `conf/slaves` file.
-- `bin/start-all.sh` - Starts both a master and a number of slaves as described above.
-- `bin/stop-master.sh` - Stops the master that was started via the `bin/start-master.sh` script.
-- `bin/stop-slaves.sh` - Stops the slave instances that were started via `bin/start-slaves.sh`.
-- `bin/stop-all.sh` - Stops both the master and the slaves as described above.
+- `sbin/start-master.sh` - Starts a master instance on the machine the script is executed on.
+- `sbin/start-slaves.sh` - Starts a slave instance on each machine specified in the `conf/slaves` file.
+- `sbin/start-all.sh` - Starts both a master and a number of slaves as described above.
+- `sbin/stop-master.sh` - Stops the master that was started via the `bin/start-master.sh` script.
+- `sbin/stop-slaves.sh` - Stops the slave instances that were started via `bin/start-slaves.sh`.
+- `sbin/stop-all.sh` - Stops both the master and the slaves as described above.
Note that these scripts must be executed on the machine you want to run the Spark master on, not your local machine.
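
For illustration, here is a minimal sketch of how a standalone cluster might be brought up and torn down using the relocated `sbin` scripts. The worker hostnames in `conf/slaves` and the `$SPARK_HOME` path are hypothetical, and the master URL assumes Spark's default port 7077:

```sh
# Run from the machine that should host the master, inside the Spark directory.
cd $SPARK_HOME

# conf/slaves lists one worker hostname per line, for example:
#   worker1.example.com
#   worker2.example.com

# Start a master on this machine and a worker on every host in conf/slaves
# (after this change the scripts live under sbin/ rather than bin/).
./sbin/start-all.sh

# Applications then connect to the master at spark://<master-host>:7077.

# Stop both the master and the workers.
./sbin/stop-all.sh
```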