path: root/sbin/spark-config.sh
author      Prashant Sharma <prashant.s@imaginea.com>  2014-09-08 10:24:15 -0700
committer   Andrew Or <andrewor14@gmail.com>  2014-09-08 10:24:15 -0700
commit      e16a8e7db5a3b1065b14baf89cb723a59b99226b (patch)
tree        09d5b9bd510325047aa20f62f215184e46367bdb /sbin/spark-config.sh
parent      711356b422c66e2a80377a9f43fce97282460520 (diff)
download    spark-e16a8e7db5a3b1065b14baf89cb723a59b99226b.tar.gz
            spark-e16a8e7db5a3b1065b14baf89cb723a59b99226b.tar.bz2
            spark-e16a8e7db5a3b1065b14baf89cb723a59b99226b.zip
SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within.
... Tested ! TBH, it isn't a great idea to have directory with spaces within. Because emacs doesn't like it then hadoop doesn't like it. and so on...

Author: Prashant Sharma <prashant.s@imaginea.com>

Closes #2229 from ScrapCodes/SPARK-3337/quoting-shell-scripts and squashes the following commits:

d4ad660 [Prashant Sharma] SPARK-3337 Paranoid quoting in shell to allow install dirs with spaces within.
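To illustrate the failure mode this commit guards against, here is a minimal shell sketch (not part of the commit; the "/tmp/dir with spaces" path is made up for the example):

# Hypothetical layout used only for this illustration.
mkdir -p "/tmp/dir with spaces"
this="/tmp/dir with spaces/spark-config.sh"

# Unquoted expansion: word splitting hands dirname and cd several words
# instead of one path, so the resolution fails or points at the wrong place.
common_bin=$(cd -P -- $(dirname -- $this) && pwd -P)

# Paranoid quoting (as in this patch): the path stays a single word through
# every expansion, so a directory with spaces resolves correctly.
common_bin="$(cd -P -- "$(dirname -- "$this")" && pwd -P)"
echo "$common_bin"    # -> /tmp/dir with spaces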
Diffstat (limited to 'sbin/spark-config.sh')
-rwxr-xr-x  sbin/spark-config.sh  16
1 file changed, 8 insertions, 8 deletions
diff --git a/sbin/spark-config.sh b/sbin/spark-config.sh
index 5c87da5815..2718d6cba1 100755
--- a/sbin/spark-config.sh
+++ b/sbin/spark-config.sh
@@ -21,19 +21,19 @@
 
 # resolve links - $0 may be a softlink
 this="${BASH_SOURCE-$0}"
-common_bin=$(cd -P -- "$(dirname -- "$this")" && pwd -P)
+common_bin="$(cd -P -- "$(dirname -- "$this")" && pwd -P)"
 script="$(basename -- "$this")"
 this="$common_bin/$script"
 
 # convert relative path to absolute path
-config_bin=`dirname "$this"`
-script=`basename "$this"`
-config_bin=`cd "$config_bin"; pwd`
+config_bin="`dirname "$this"`"
+script="`basename "$this"`"
+config_bin="`cd "$config_bin"; pwd`"
 this="$config_bin/$script"
 
-export SPARK_PREFIX=`dirname "$this"`/..
-export SPARK_HOME=${SPARK_PREFIX}
+export SPARK_PREFIX="`dirname "$this"`"/..
+export SPARK_HOME="${SPARK_PREFIX}"
 export SPARK_CONF_DIR="$SPARK_HOME/conf"
 # Add the PySpark classes to the PYTHONPATH:
-export PYTHONPATH=$SPARK_HOME/python:$PYTHONPATH
-export PYTHONPATH=$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH
+export PYTHONPATH="$SPARK_HOME/python:$PYTHONPATH"
+export PYTHONPATH="$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$PYTHONPATH"