author     Andrew Or <andrewor14@gmail.com>  2014-08-02 00:45:38 -0700
committer  Patrick Wendell <pwendell@gmail.com>  2014-08-02 00:45:38 -0700
commit     148af6082cdb44840bbd61c7a4f67a95badad10b (patch)
tree       8acbf61d0c81122c9d6fb3b18940f5b4047f6689 /repl
parent     d934801d53fc2f1d57d3534ae4e1e9384c7dda99 (diff)
[SPARK-2454] Do not ship spark home to Workers
When standalone Workers launch executors, they inherit the Spark home set by the driver. This means that if the worker machines do not share the same directory structure as the driver node, the Workers will attempt to run scripts (e.g. bin/compute-classpath.sh) that do not exist locally, and fail. This is a common scenario if the driver is launched from outside of the cluster. The solution is to simply not pass the driver's Spark home to the Workers. This PR further makes an attempt to avoid overloading the usages of `spark.home`, which is now only used for setting executor Spark home on Mesos and in python.

This is based on top of #1392 and originally reported by YanTangZhai. Tested on standalone cluster.

Author: Andrew Or <andrewor14@gmail.com>

Closes #1734 from andrewor14/spark-home-reprise and squashes the following commits:

f71f391 [Andrew Or] Revert changes in python
1c2532c [Andrew Or] Merge branch 'master' of github.com:apache/spark into spark-home-reprise
188fc5d [Andrew Or] Avoid using spark.home where possible
09272b7 [Andrew Or] Always use Worker's working directory as spark home
Diffstat (limited to 'repl')
-rw-r--r--  repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala | 3
1 file changed, 0 insertions(+), 3 deletions(-)
diff --git a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
index 42c7e511dc..65788f4646 100644
--- a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
+++ b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
@@ -969,9 +969,6 @@ class SparkILoop(in0: Option[BufferedReader], protected val out: JPrintWriter,
if (execUri != null) {
conf.set("spark.executor.uri", execUri)
}
- if (System.getenv("SPARK_HOME") != null) {
- conf.setSparkHome(System.getenv("SPARK_HOME"))
- }
sparkContext = new SparkContext(conf)
logInfo("Created spark context..")
sparkContext
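The three lines removed above copied the driver's local SPARK_HOME into the SparkConf, which standalone Workers then used when building executor launch commands. The path mismatch this caused can be sketched as follows; the object, helper names, and example paths are hypothetical illustrations, not Spark API:

```scala
// Sketch of the SPARK-2454 failure mode (hypothetical helpers, not Spark code).
object SparkHomeMismatch {
  // Spark home where the driver was launched, possibly outside the cluster.
  val driverSparkHome = "/home/alice/spark"
  // The Worker's own working directory on the cluster machine.
  val workerDir = "/opt/spark/work"

  // Before the fix: the Worker prefixed scripts with the driver's Spark home,
  // a path that may not exist on the Worker machine at all.
  def scriptPathBefore(driverHome: String): String =
    s"$driverHome/bin/compute-classpath.sh"

  // After the fix: the driver's Spark home is never shipped, so the Worker
  // resolves scripts relative to its own directory layout.
  def scriptPathAfter(workerHome: String): String =
    s"$workerHome/bin/compute-classpath.sh"

  def main(args: Array[String]): Unit = {
    println(scriptPathBefore(driverSparkHome)) // broken if layouts differ
    println(scriptPathAfter(workerDir))        // always valid locally
  }
}
```

With the removed lines gone, the REPL's SparkConf simply carries no `spark.home`, so each Worker falls back to its own layout rather than the driver's.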