| author | Yuming Wang <wgyumg@gmail.com> | 2017-03-08 11:31:01 +0000 |
|---|---|---|
| committer | Sean Owen <sowen@cloudera.com> | 2017-03-08 11:31:01 +0000 |
| commit | 3f9f9180c2e695ad468eb813df5feec41e169531 (patch) | |
| tree | f1d56e7a5b51beabaa340f326ba764466ebd507a /sql/core/src/test | |
| parent | 81303f7ca7808d51229411dce8feeed8c23dbe15 (diff) | |
[SPARK-19693][SQL] Make the SET mapreduce.job.reduces automatically converted to spark.sql.shuffle.partitions
## What changes were proposed in this pull request?
Make `SET mapreduce.job.reduces` automatically convert to `spark.sql.shuffle.partitions`, similar to the existing handling of `SET mapred.reduce.tasks`.
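The conversion can be thought of as a key-aliasing step applied when a SET command is parsed. The sketch below is illustrative only, not Spark's actual internals; the object and method names (`SetCommandAlias`, `normalize`) are hypothetical:

```scala
// Illustrative sketch of aliasing legacy Hadoop reducer keys to the Spark
// shuffle-partition property. Names here are hypothetical, not Spark APIs.
object SetCommandAlias {
  val ShufflePartitions = "spark.sql.shuffle.partitions"

  // Legacy Hadoop/Hive keys treated as aliases for the Spark property.
  private val legacyKeys = Set("mapred.reduce.tasks", "mapreduce.job.reduces")

  // Rewrites a legacy key to the canonical Spark key and validates the value.
  // `require` throws IllegalArgumentException for non-positive values, which
  // matches the behavior exercised by the test in this patch.
  def normalize(key: String, value: String): (String, Int) = {
    val n = value.toInt
    require(n > 0, s"$key should be a positive number, but was $n")
    if (legacyKeys.contains(key)) (ShufflePartitions, n) else (key, n)
  }
}
```

Under this scheme both legacy keys resolve to the same Spark property, and a value such as `-1` fails fast with `IllegalArgumentException`.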
## How was this patch tested?
Unit tests.
Author: Yuming Wang <wgyumg@gmail.com>
Closes #17020 from wangyum/SPARK-19693.
Diffstat (limited to 'sql/core/src/test')
| -rw-r--r-- | sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala | 12 |
1 file changed, 12 insertions, 0 deletions
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala
index 468ea05512..d9e0196c57 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala
@@ -1019,6 +1019,18 @@ class SQLQuerySuite extends QueryTest with SharedSQLContext {
     spark.sessionState.conf.clear()
   }
 
+  test("SET mapreduce.job.reduces automatically converted to spark.sql.shuffle.partitions") {
+    spark.sessionState.conf.clear()
+    val before = spark.conf.get(SQLConf.SHUFFLE_PARTITIONS.key).toInt
+    val newConf = before + 1
+    sql(s"SET mapreduce.job.reduces=${newConf.toString}")
+    val after = spark.conf.get(SQLConf.SHUFFLE_PARTITIONS.key).toInt
+    assert(before != after)
+    assert(newConf === after)
+    intercept[IllegalArgumentException](sql(s"SET mapreduce.job.reduces=-1"))
+    spark.sessionState.conf.clear()
+  }
+
   test("apply schema") {
     val schema1 = StructType(
       StructField("f1", IntegerType, false) ::