author    WangTaoTheTonic <barneystinson@aliyun.com>   2014-12-04 11:52:47 -0800
committer Josh Rosen <joshrosen@databricks.com>        2014-12-04 11:53:23 -0800
commit    8106b1e36b2c2b9f5dc5d7252540e48cc3fc96d5
tree      4fe2d200d3e34cf2516425cb73349eb5d3ac76e6
parent    28c7acacef974fdabd2b9ecc20d0d6cf6c58728f
[SPARK-4253] Ignore spark.driver.host in yarn-cluster and standalone-cluster modes
In yarn-cluster and standalone-cluster modes, we don't know where the driver will run until it is launched. If the `spark.driver.host` property is set on the submitting machine and propagated to the driver through SparkConf, this leads to errors when the driver launches. This patch fixes the issue by dropping the `spark.driver.host` property in SparkSubmit when running in a cluster deploy mode.

Author: WangTaoTheTonic <barneystinson@aliyun.com>
Author: WangTao <barneystinson@aliyun.com>

Closes #3112 from WangTaoTheTonic/SPARK4253 and squashes the following commits:

ed1a25c [WangTaoTheTonic] revert unrelated formatting issue
02c4e49 [WangTao] add comment
32a3f3f [WangTaoTheTonic] ignore it in SparkSubmit instead of SparkContext
667cf24 [WangTaoTheTonic] document fix
ff8d5f7 [WangTaoTheTonic] also ignore it in standalone cluster mode
2286e6b [WangTao] ignore spark.driver.host in yarn-cluster mode
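
For illustration, here is a minimal standalone sketch of the filtering idea (plain Scala with hypothetical names such as deployMode and sysProps; it mirrors, but is not, the SparkSubmit code in the diff below):

import scala.collection.mutable.HashMap

object DriverHostFilterSketch {
  val CLUSTER = "cluster"

  def main(argv: Array[String]): Unit = {
    // Properties captured from the submitting machine's configuration.
    val sysProps = HashMap(
      "spark.app.name"    -> "example",
      "spark.driver.host" -> "submitter-host.example.com")

    val deployMode = CLUSTER

    // In cluster deploy modes the driver's host is unknown at submit time,
    // so a host recorded on the submitting machine would be stale: drop it.
    if (deployMode == CLUSTER) {
      sysProps -= "spark.driver.host"
    }

    println(sysProps)  // spark.driver.host is gone; the driver sets its own.
  }
}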
-rw-r--r--  core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala                             | 5
-rw-r--r--  yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMasterArguments.scala  | 2
2 files changed, 6 insertions(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
index 0c7d247519..955cbd6dab 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
@@ -281,6 +281,11 @@ object SparkSubmit {
       sysProps.getOrElseUpdate(k, v)
     }
 
+    // Ignore invalid spark.driver.host in cluster modes.
+    if (deployMode == CLUSTER) {
+      sysProps -= ("spark.driver.host")
+    }
+
     // Resolve paths in certain spark properties
     val pathConfigs = Seq(
       "spark.jars",
diff --git a/yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMasterArguments.scala b/yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMasterArguments.scala
index 8b32c76d14..d76a63276d 100644
--- a/yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMasterArguments.scala
+++ b/yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMasterArguments.scala
@@ -36,7 +36,7 @@ class ApplicationMasterArguments(val args: Array[String]) {
 
     var args = inputArgs
 
-    while (! args.isEmpty) {
+    while (!args.isEmpty) {
       // --num-workers, --worker-memory, and --worker-cores are deprecated since 1.0,
       // the properties with executor in their names are preferred.
       args match {
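
For readers unfamiliar with this parsing style, below is a minimal standalone sketch of the same while (!args.isEmpty) / pattern-match loop (hypothetical option names, not the real ApplicationMasterArguments fields):

object ArgParseSketch {
  def main(argv: Array[String]): Unit = {
    var executorMemory = "1g"
    var executorCores = 1

    // Consume the argument list pair-by-pair, in the style shown above.
    var args = argv.toList
    while (!args.isEmpty) {
      args match {
        case "--executor-memory" :: value :: tail =>
          executorMemory = value
          args = tail
        case "--executor-cores" :: value :: tail =>
          executorCores = value.toInt
          args = tail
        case unknown :: _ =>
          sys.error("Unknown argument: " + unknown)
        case Nil =>  // unreachable: the loop guard excludes the empty list
      }
    }

    println(s"memory=$executorMemory cores=$executorCores")
  }
}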