author    Andrew Or <andrewor14@gmail.com>    2014-08-02 00:45:38 -0700
committer Patrick Wendell <pwendell@gmail.com>    2014-08-02 00:45:38 -0700
commit    148af6082cdb44840bbd61c7a4f67a95badad10b (patch)
tree      8acbf61d0c81122c9d6fb3b18940f5b4047f6689 /project
parent    d934801d53fc2f1d57d3534ae4e1e9384c7dda99 (diff)
[SPARK-2454] Do not ship spark home to Workers
When standalone Workers launch executors, they inherit the Spark home set by the driver. This means that if the worker machines do not share the same directory structure as the driver node, the Workers will attempt to run scripts (e.g. bin/compute-classpath.sh) that do not exist locally, and fail. This is a common scenario when the driver is launched from outside the cluster. The solution is simply not to pass the driver's Spark home to the Workers. This PR also attempts to avoid overloading the uses of `spark.home`, which is now only used for setting the executor Spark home on Mesos and in Python. This is based on top of #1392 and was originally reported by YanTangZhai. Tested on a standalone cluster.

Author: Andrew Or <andrewor14@gmail.com>

Closes #1734 from andrewor14/spark-home-reprise and squashes the following commits:

f71f391 [Andrew Or] Revert changes in python
1c2532c [Andrew Or] Merge branch 'master' of github.com:apache/spark into spark-home-reprise
188fc5d [Andrew Or] Avoid using spark.home where possible
09272b7 [Andrew Or] Always use Worker's working directory as spark home
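For context, here is a minimal sketch of where `spark.home` still matters after this change: an application running on Mesos can set the executor Spark home explicitly through `SparkConf.setSparkHome`, which writes the `spark.home` property. The Mesos master URL and install path below are placeholders, not values from this commit.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch only: after this change, `spark.home` is relevant for Mesos executors
// (and in Python); standalone Workers use their own working directory instead.
object MesosSparkHomeExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("spark-home-example")
      .setMaster("mesos://host:5050")   // placeholder Mesos master URL
      .setSparkHome("/opt/spark")       // sets the `spark.home` property for executors
    val sc = new SparkContext(conf)
    try {
      println(sc.parallelize(1 to 10).sum())
    } finally {
      sc.stop()
    }
  }
}
```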
Diffstat (limited to 'project')
-rw-r--r--  project/SparkBuild.scala | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index a8bbd55861..1d7cc6dd6a 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -328,7 +328,7 @@ object TestSettings {
   lazy val settings = Seq (
     // Fork new JVMs for tests and set Java options for those
     fork := true,
-    javaOptions in Test += "-Dspark.home=" + sparkHome,
+    javaOptions in Test += "-Dspark.test.home=" + sparkHome,
     javaOptions in Test += "-Dspark.testing=1",
     javaOptions in Test += "-Dsun.io.serialization.extendedDebugInfo=true",
     javaOptions in Test ++= System.getProperties.filter(_._1 startsWith "spark")
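With the build now exporting `-Dspark.test.home`, test code would look up that property instead of `spark.home`. The sketch below is illustrative only; the helper object and failure message are assumptions, not code from the Spark test suites.

```scala
// Illustrative sketch: resolving the test-only Spark home inside a test suite.
// The property name matches the build change above; the fallback behavior is an assumption.
object TestHome {
  def sparkTestHome: String =
    sys.props.getOrElse(
      "spark.test.home",
      throw new IllegalStateException(
        "spark.test.home is not set; run the tests through sbt so TestSettings applies it"))
}
```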