author    Luc Bourlier <luc.bourlier@typesafe.com>  2015-12-18 16:21:01 -0800
committer Andrew Or <andrew@databricks.com>  2015-12-18 16:21:01 -0800
commit    ba9332edd889730c906404041bc83b1643d80961 (patch)
tree      d9647a62b92bb0225337334e0a463f8002d9925e /core/src/main/scala
parent    007a32f90af1065bfa3ca4cdb194c40c06e87abf (diff)
[SPARK-12345][CORE] Do not send SPARK_HOME through Spark submit REST interface
SPARK_HOME is usually an invalid location on the remote machine executing the job. It is picked up by the Mesos support in cluster mode and most of the time causes the job to fail.
Fixes SPARK-12345
Author: Luc Bourlier <luc.bourlier@typesafe.com>
Closes #10329 from skyluc/issue/SPARK_HOME.
Diffstat (limited to 'core/src/main/scala')
 core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala b/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala
index f0dd667ea1..0744c64d5e 100644
--- a/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala
@@ -428,8 +428,10 @@ private[spark] object RestSubmissionClient {
    * Filter non-spark environment variables from any environment.
    */
   private[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
-    env.filter { case (k, _) =>
-      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED") || k.startsWith("MESOS_")
+    env.filterKeys { k =>
+      // SPARK_HOME is filtered out because it is usually wrong on the remote machine (SPARK-12345)
+      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
+        k.startsWith("MESOS_")
     }
   }
 }
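The patched filter can be exercised in isolation. The sketch below mirrors the new `filterSystemEnvironment` body (the `FilterEnvDemo` object and the sample environment map are illustrative, not part of the Spark codebase); the trailing `.toMap` is added so the snippet also works on Scala 2.13, where `filterKeys` returns a view.

```scala
object FilterEnvDemo {
  // Mirrors the patched filterSystemEnvironment: keep SPARK_* and MESOS_*
  // variables, but drop SPARK_ENV_LOADED and SPARK_HOME (SPARK-12345).
  def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
    env.filterKeys { k =>
      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
        k.startsWith("MESOS_")
    }.toMap
  }

  def main(args: Array[String]): Unit = {
    val env = Map(
      "SPARK_HOME" -> "/opt/spark",         // dropped: usually wrong on the remote machine
      "SPARK_ENV_LOADED" -> "1",            // dropped: internal marker, must not propagate
      "SPARK_DRIVER_MEMORY" -> "2g",        // kept: regular SPARK_* setting
      "MESOS_NATIVE_JAVA_LIBRARY" -> "/usr/lib/libmesos.so", // kept: MESOS_* setting
      "PATH" -> "/usr/bin"                  // dropped: not a Spark or Mesos variable
    )
    println(filterSystemEnvironment(env).keys.toList.sorted.mkString(","))
  }
}
```

Running the demo prints only the surviving keys, showing that SPARK_HOME no longer reaches the REST submission payload while ordinary SPARK_* and MESOS_* settings still do.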