author     Dongjoon Hyun <dongjoon@apache.org>  2017-01-10 10:49:44 -0800
committer  Shixiong Zhu <shixiong@databricks.com>  2017-01-10 10:49:44 -0800
commit     d5b1dc934a2482886c2c095de90e4c6a49ec42bd (patch)
tree       2def6cf9c3fe425933b16918b07344f4394ea553 /sql
parent     3ef183a941d45b2f7ad167ea5133a93de0da5176 (diff)
[SPARK-19137][SQL] Fix `withSQLConf` to reset `OptionalConfigEntry` correctly
## What changes were proposed in this pull request?

`DataStreamReaderWriterSuite` creates test files in the source folder, as shown below. The root cause is that `withSQLConf` fails to reset an `OptionalConfigEntry` correctly: instead of clearing an unset key, it resets the config to the placeholder value `Some(undefined)`.

```bash
$ git status
Untracked files:
  (use "git add <file>..." to include in what will be committed)

        sql/core/%253Cundefined%253E/
        sql/core/%3Cundefined%3E/
```

## How was this patch tested?

Manual.

```
build/sbt "project sql" test
git status
```

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #16522 from dongjoon-hyun/SPARK-19137.
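For context on the mechanics: `spark.conf.get` on a registered but unset `OptionalConfigEntry` does not throw; it falls back to the entry's placeholder string `<undefined>`, so the old `Try(...).toOption` capture yielded `Some("<undefined>")` rather than `None`, and the cleanup path then wrote that placeholder back as a real value. A minimal sketch of the two capture strategies, assuming a running `SparkSession` bound to `spark`; the key name is just an illustrative optional entry:

```scala
import scala.util.Try

// Any registered OptionalConfigEntry shows the same behavior; this key is
// just an illustrative example.
val key = "spark.sql.streaming.checkpointLocation"

// Old capture: get() on a registered-but-unset optional entry does not throw;
// it falls back to the placeholder string "<undefined>", so toOption yields
// Some("<undefined>") instead of None.
val saved = Try(spark.conf.get(key)).toOption

// Fixed capture: contains() is true only for explicitly set keys, so an
// unset optional entry is recorded as None.
val savedFixed =
  if (spark.conf.contains(key)) Some(spark.conf.get(key)) else None
```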
Diffstat (limited to 'sql')
-rw-r--r--  sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala | 8 +++++++-
1 file changed, 7 insertions(+), 1 deletion(-)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala b/sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala
index d4d8e3e4e8..d4afb9d8af 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/test/SQLTestUtils.scala
@@ -94,7 +94,13 @@ private[sql] trait SQLTestUtils
    */
   protected def withSQLConf(pairs: (String, String)*)(f: => Unit): Unit = {
     val (keys, values) = pairs.unzip
-    val currentValues = keys.map(key => Try(spark.conf.get(key)).toOption)
+    val currentValues = keys.map { key =>
+      if (spark.conf.contains(key)) {
+        Some(spark.conf.get(key))
+      } else {
+        None
+      }
+    }
     (keys, values).zipped.foreach(spark.conf.set)
     try f finally {
       keys.zip(currentValues).foreach {
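The hunk above is cut off at the restore loop inside `finally`. For reference, the restore step pairs each key with its saved `Option` and either restores the old value or unsets the key; a sketch of that pattern (the loop body itself is not shown in this excerpt):

```scala
// Restore step paired with the capture above: with the fix, a key that was
// unset on entry is saved as None and is unset again on exit, instead of
// being set to the literal "<undefined>" (which produced the stray
// %3Cundefined%3E directories in the source tree).
keys.zip(currentValues).foreach {
  case (key, Some(value)) => spark.conf.set(key, value)
  case (key, None) => spark.conf.unset(key)
}
```

A test written as `withSQLConf("some.key" -> "value") { ... }` therefore now leaves the session configuration exactly as it found it.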