path: root/sbin/start-slave.sh
author    jerryshao <sshao@hortonworks.com>  2015-11-04 10:49:34 +0000
committer Sean Owen <sowen@cloudera.com>     2015-11-04 10:49:34 +0000
commit    8aff36e91de0fee2f3f56c6d240bb203b5bb48ba (patch)
tree      0afdded361cb75e7658053953abfdb484da78ced /sbin/start-slave.sh
parent    2692bdb7dbf36d6247f595d5fd0cb9cda89e1fdd (diff)
[SPARK-2960][DEPLOY] Support executing Spark from symlinks (reopen)
This PR is based on roji's work to support running Spark scripts from symlinks. Thanks for the great work, roji. Would you mind taking a look at this PR? Thanks a lot.

Distributions such as HDP normally expose the Spark executables as symlinks placed on `PATH`, but Spark's current scripts do not recursively resolve the real path behind a symlink, so Spark fails to execute when invoked through one. This PR addresses the issue by resolving the absolute path from the symlink.

Rather than using `readlink -f` as the earlier PR (https://github.com/apache/spark/pull/2386) did, the path is resolved manually in a loop, because `-f` is not supported on Mac.

I've tested on Mac and Linux (CentOS), and it looks fine.

This PR does not fix the scripts under the `sbin` folder; I'm not sure whether they need to be fixed as well. Please help to review; any comment is greatly appreciated.

Author: jerryshao <sshao@hortonworks.com>
Author: Shay Rojansky <roji@roji.org>

Closes #8669 from jerryshao/SPARK-2960.
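The description above refers to resolving a symlink manually in a loop instead of relying on `readlink -f` (unavailable on Mac). A minimal sketch of that technique follows; the variable names are illustrative and not necessarily the exact code used in the patch:

# Resolve $0 to its real location even when invoked through a chain of symlinks.
# Works without `readlink -f`, so it also runs on Mac.
SOURCE="$0"
while [ -h "$SOURCE" ]; do
  # cd -P resolves the directory itself in case it is also a symlink
  DIR="$(cd -P "$(dirname "$SOURCE")" && pwd)"
  SOURCE="$(readlink "$SOURCE")"
  # A relative link target is resolved against the directory containing the link
  case "$SOURCE" in
    /*) ;;
    *) SOURCE="$DIR/$SOURCE" ;;
  esac
done
SCRIPT_DIR="$(cd -P "$(dirname "$SOURCE")" && pwd)"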
Diffstat (limited to 'sbin/start-slave.sh')
-rwxr-xr-x  sbin/start-slave.sh  18
1 file changed, 9 insertions, 9 deletions
diff --git a/sbin/start-slave.sh b/sbin/start-slave.sh
index 4c919ff76a..21455648d1 100755
--- a/sbin/start-slave.sh
+++ b/sbin/start-slave.sh
@@ -21,14 +21,14 @@
#
# Environment Variables
#
-# SPARK_WORKER_INSTANCES The number of worker instances to run on this
+# SPARK_WORKER_INSTANCES The number of worker instances to run on this
# slave. Default is 1.
-# SPARK_WORKER_PORT The base port number for the first worker. If set,
+# SPARK_WORKER_PORT The base port number for the first worker. If set,
# subsequent workers will increment this number. If
# unset, Spark will find a valid port number, but
# with no guarantee of a predictable pattern.
# SPARK_WORKER_WEBUI_PORT The base port for the web interface of the first
-# worker. Subsequent workers will increment this
+# worker. Subsequent workers will increment this
# number. Default is 8081.
usage="Usage: start-slave.sh <spark-master-URL> where <spark-master-URL> is like spark://localhost:7077"
@@ -39,12 +39,13 @@ if [ $# -lt 1 ]; then
exit 1
fi
-sbin="`dirname "$0"`"
-sbin="`cd "$sbin"; pwd`"
+if [ -z "${SPARK_HOME}" ]; then
+ export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
+fi
-. "$sbin/spark-config.sh"
+. "${SPARK_HOME}/sbin/spark-config.sh"
-. "$SPARK_PREFIX/bin/load-spark-env.sh"
+. "${SPARK_HOME}/bin/load-spark-env.sh"
# First argument should be the master; we need to store it aside because we may
# need to insert arguments between it and the other arguments
@@ -71,7 +72,7 @@ function start_instance {
fi
WEBUI_PORT=$(( $SPARK_WORKER_WEBUI_PORT + $WORKER_NUM - 1 ))
- "$sbin"/spark-daemon.sh start org.apache.spark.deploy.worker.Worker $WORKER_NUM \
+ "${SPARK_HOME}/sbin"/spark-daemon.sh start org.apache.spark.deploy.worker.Worker $WORKER_NUM \
--webui-port "$WEBUI_PORT" $PORT_FLAG $PORT_NUM $MASTER "$@"
}
@@ -82,4 +83,3 @@ else
start_instance $(( 1 + $i )) "$@"
done
fi
-
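For context, the header comments in the first hunk document the worker environment variables this script honors. An invocation using them (port values and worker count are examples only, based on the usage string above) might look like:

# Start two workers on this host against a local master; subsequent workers
# increment the base ports automatically.
SPARK_WORKER_INSTANCES=2 \
SPARK_WORKER_PORT=7078 \
SPARK_WORKER_WEBUI_PORT=8081 \
./sbin/start-slave.sh spark://localhost:7077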