| author | Herman van Hovell <hvanhovell@databricks.com> | 2017-04-20 22:37:04 +0200 |
|---|---|---|
| committer | Herman van Hovell <hvanhovell@databricks.com> | 2017-04-20 22:37:04 +0200 |
| commit | 033206355339677812a250b2b64818a261871fd2 (patch) | |
| tree | d000d6c55f08f9454e87e13cea74c187593fdd20 | |
| parent | d95e4d9d6a9705c534549add6d4a73d554e47274 (diff) | |
[SPARK-20410][SQL] Make sparkConf a def in SharedSQLContext
## What changes were proposed in this pull request?
It is kind of annoying that `SharedSQLContext.sparkConf` is a val when overriding test cases, because you cannot call `super` on it. This PR makes it a function.
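The motivation can be illustrated with a minimal sketch (the class and member names below are hypothetical, not the actual Spark test harness): Scala forbids `super` access to a `val` member, so a subclass overriding a `val sparkConf` cannot extend the parent's configuration and must rebuild it from scratch. With a `def`, the override composes naturally:

```scala
// Illustrative only: shows why a `def` member composes under override
// while a `val` member cannot call `super`.
class BaseSuite {
  // If this were `val conf`, an overriding subclass could not write
  // `super.conf` — Scala rejects super access to vals.
  def conf: Map[String, String] = Map("spark.app.name" -> "test")
}

class ParallelismSuite extends BaseSuite {
  // With a def, the override extends the parent's settings instead of
  // replacing them wholesale.
  override def conf: Map[String, String] =
    super.conf + ("spark.default.parallelism" -> "1")
}
```

Here `new ParallelismSuite().conf` keeps the inherited `spark.app.name` entry while adding `spark.default.parallelism`, which is exactly the composition the PR enables for `SharedSQLContext.sparkConf`.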
## How was this patch tested?
Existing tests.
Author: Herman van Hovell <hvanhovell@databricks.com>
Closes #17705 from hvanhovell/SPARK-20410.
Diffstat (limited to 'sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/FileSourceStrategySuite.scala')
| -rw-r--r-- | sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/FileSourceStrategySuite.scala | 2 |
|---|---|---|

1 file changed, 1 insertion, 1 deletion
```diff
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/FileSourceStrategySuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/FileSourceStrategySuite.scala
index f36162858b..8703fe96e5 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/FileSourceStrategySuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/FileSourceStrategySuite.scala
@@ -42,7 +42,7 @@ import org.apache.spark.util.Utils
 class FileSourceStrategySuite extends QueryTest with SharedSQLContext with PredicateHelper {
   import testImplicits._
 
-  protected override val sparkConf = new SparkConf().set("spark.default.parallelism", "1")
+  protected override def sparkConf = super.sparkConf.set("spark.default.parallelism", "1")
 
   test("unpartitioned table, single partition") {
     val table =
```