| author | Holden Karau <holden@pigscanfly.ca> | 2015-09-03 09:30:54 +0100 |
|---|---|---|
| committer | Sean Owen <sowen@cloudera.com> | 2015-09-03 09:30:54 +0100 |
| commit | 67580f1f574d272af3712fd91458f3c87368c2e4 (patch) | |
| tree | a6f2326ba254e41c18e12774767c3d2f9300a607 /yarn/src | |
| parent | 0349b5b4383cf813bea4e1053bcc4e0268603743 (diff) | |
[SPARK-10332] [CORE] Fix yarn spark executor validation
From Jira:
Running spark-submit on YARN with --num-executors equal to 0 when not using dynamic allocation should error out.
In Spark 1.5.0 it instead continues and the application ends up hanging.
yarn.ClientArguments still has the check so something else must have changed.
spark-submit --master yarn --deploy-mode cluster --class org.apache.spark.examples.SparkPi --num-executors 0 ....
Spark 1.4.1 errors out with:
java.lang.IllegalArgumentException:
Number of executors was 0, but must be at least 1
(or 0 if dynamic executor allocation is enabled).
Author: Holden Karau <holden@pigscanfly.ca>
Closes #8580 from holdenk/SPARK-10332-spark-submit-to-yarn-executors-0-message.
Diffstat (limited to 'yarn/src')
-rw-r--r-- | yarn/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala | 3 |
1 file changed, 3 insertions, 0 deletions
```diff
diff --git a/yarn/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala b/yarn/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala
index 4f42ffefa7..54f62e6b72 100644
--- a/yarn/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala
+++ b/yarn/src/main/scala/org/apache/spark/deploy/yarn/ClientArguments.scala
@@ -96,6 +96,9 @@ private[spark] class ClientArguments(args: Array[String], sparkConf: SparkConf)
       }
       numExecutors = initialNumExecutors
+    } else {
+      val numExecutorsConf = "spark.executor.instances"
+      numExecutors = sparkConf.getInt(numExecutorsConf, numExecutors)
     }
     principal = Option(principal)
       .orElse(sparkConf.getOption("spark.yarn.principal"))
```
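The effect of the patch can be illustrated with a small standalone sketch (this is not the actual Spark source: a plain `Map` stands in for `SparkConf`, and `resolveNumExecutors`/`validate` are hypothetical helper names). With dynamic allocation off, the executor count now falls back to `spark.executor.instances` before the at-least-1 check runs, so `--num-executors 0` is rejected again instead of hanging:

```scala
// Hedged sketch of the resolution + validation logic from ClientArguments.
// A plain Map stands in for SparkConf; function names are illustrative only.
object NumExecutorsSketch {

  // Mirrors the patched else-branch: when dynamic allocation is off, prefer
  // the spark.executor.instances setting over the built-in default.
  def resolveNumExecutors(conf: Map[String, String], default: Int): Int = {
    val dynamic = conf.get("spark.dynamicAllocation.enabled").contains("true")
    if (dynamic) {
      conf.get("spark.dynamicAllocation.initialExecutors").map(_.toInt).getOrElse(default)
    } else {
      conf.get("spark.executor.instances").map(_.toInt).getOrElse(default)
    }
  }

  // Mirrors the check that Spark 1.4.1 enforced: zero executors is only
  // acceptable when dynamic allocation is enabled.
  def validate(numExecutors: Int, dynamicEnabled: Boolean): Unit = {
    require(dynamicEnabled || numExecutors > 0,
      s"Number of executors was $numExecutors, but must be at least 1 " +
        "(or 0 if dynamic executor allocation is enabled).")
  }
}
```

Under this sketch, submitting with `--num-executors 0` and no dynamic allocation makes `validate` throw the `IllegalArgumentException` quoted above, rather than letting the submission proceed and hang.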