author     Aaron Davidson <aaron@databricks.com>  2014-03-24 22:24:21 -0700
committer  Aaron Davidson <aaron@databricks.com>  2014-03-24 22:24:21 -0700
commit     007a733434aa39cdb137ab9795434ae2af70fe0b (patch)
tree       431f3cfd674c8e825c51bd6a0ecda8d3cc19ab1f /bin/pyspark
parent     b637f2d91ab4d3d5bf13e8d959c919ebd776f6af (diff)
SPARK-1286: Make usage of spark-env.sh idempotent
Various Spark scripts load spark-env.sh. This can cause repeated growth of variables that are appended to (SPARK_CLASSPATH, SPARK_REPL_OPTS), and it makes the precedence order for options specified in spark-env.sh less clear. One use case for the latter is that we want to set options from the command line of spark-shell, but these options will be overridden by a subsequent load of spark-env.sh. If we load spark-env.sh first and then set our command-line options, we can guarantee the correct precedence order.

Note that we use SPARK_CONF_DIR if available to support the sbin/ scripts, which always set this variable from sbin/spark-config.sh. Otherwise, we default to ../conf/ as usual.

Author: Aaron Davidson <aaron@databricks.com>

Closes #184 from aarondav/idem and squashes the following commits:

e291f91 [Aaron Davidson] Use "private" variables in load-spark-env.sh
8da8360 [Aaron Davidson] Add .sh extension to load-spark-env.sh
93a2471 [Aaron Davidson] SPARK-1286: Make usage of spark-env.sh idempotent
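For reference, a minimal sketch of how an idempotent loader along these lines could look. The guard variable name (SPARK_ENV_LOADED) and exact layout are illustrative, not copied from the actual bin/load-spark-env.sh; only the SPARK_CONF_DIR-with-../conf-fallback behavior is taken from the commit message.

#!/usr/bin/env bash
# Source spark-env.sh at most once per process, so appended variables
# (e.g. SPARK_CLASSPATH) do not grow on repeated loads.
if [ -z "$SPARK_ENV_LOADED" ]; then
  export SPARK_ENV_LOADED=1

  # Parent of the directory containing the calling script (the Spark home).
  parent_dir="$(cd "$(dirname "$0")"/..; pwd)"

  # Prefer SPARK_CONF_DIR (set by sbin/spark-config.sh for the sbin/ scripts);
  # otherwise fall back to ../conf as usual.
  user_conf_dir="${SPARK_CONF_DIR:-"$parent_dir/conf"}"

  if [ -f "$user_conf_dir/spark-env.sh" ]; then
    . "$user_conf_dir/spark-env.sh"
  fi
fi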
Diffstat (limited to 'bin/pyspark')
-rwxr-xr-x  bin/pyspark  5
1 file changed, 1 insertion, 4 deletions
diff --git a/bin/pyspark b/bin/pyspark
index ed6f8da730..67e1f61eeb 100755
--- a/bin/pyspark
+++ b/bin/pyspark
@@ -36,10 +36,7 @@ if [ ! -f "$FWDIR/RELEASE" ]; then
fi
fi
-# Load environment variables from conf/spark-env.sh, if it exists
-if [ -e "$FWDIR/conf/spark-env.sh" ] ; then
- . $FWDIR/conf/spark-env.sh
-fi
+. $FWDIR/bin/load-spark-env.sh
# Figure out which Python executable to use
if [ -z "$PYSPARK_PYTHON" ] ; then