author	Xiangrui Meng <meng@databricks.com>	2015-11-13 13:09:28 -0800
committer	Xiangrui Meng <meng@databricks.com>	2015-11-13 13:09:28 -0800
commit	2d2411faa2dd1b7312c4277b2dd9e5678195cfbb (patch)
tree	512b18e49f95ef69316b91f8436a08dd6f30a7e3 /mllib/src/test/scala/org
parent	d7b2b97ad67f9700fb8c13422c399f2edb72f770 (diff)
[SPARK-11672][ML] Set active SQLContext in MLlibTestSparkContext.beforeAll
Still saw some error messages caused by `SQLContext.getOrCreate`:
https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/3997/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.3,label=spark-test/testReport/junit/org.apache.spark.ml.util/JavaDefaultReadWriteSuite/testDefaultReadWrite/

This PR sets the active SQLContext in `beforeAll`. Calling `new SQLContext` does not set the active context automatically, so setting it explicitly makes `SQLContext.getOrCreate` return the right SQLContext.

cc: yhuai

Author: Xiangrui Meng <meng@databricks.com>

Closes #9694 from mengxr/SPARK-11672.3.
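For reference, a minimal standalone sketch of the behavior the commit message describes. The object name, local master setting, and assertion are illustrative only and are not part of this patch; it assumes the Spark 1.x `SQLContext` API (`clearActive`, `setActive`, `getOrCreate`).

    // Illustrative sketch (not part of the patch): why a test helper that builds
    // its own SQLContext should also call SQLContext.setActive.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    object ActiveSQLContextSketch {  // hypothetical object name
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setMaster("local[2]").setAppName("ActiveSQLContextSketch")
        val sc = new SparkContext(conf)

        SQLContext.clearActive()
        val sqlContext = new SQLContext(sc) // does not register itself as the active context

        // Without this call, SQLContext.getOrCreate(sc) may return (or create) a
        // different SQLContext than the one just constructed above.
        SQLContext.setActive(sqlContext)

        assert(SQLContext.getOrCreate(sc) eq sqlContext)
        sc.stop()
      }
    }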
Diffstat (limited to 'mllib/src/test/scala/org')
-rw-r--r--	mllib/src/test/scala/org/apache/spark/mllib/util/MLlibTestSparkContext.scala	1
1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/mllib/src/test/scala/org/apache/spark/mllib/util/MLlibTestSparkContext.scala b/mllib/src/test/scala/org/apache/spark/mllib/util/MLlibTestSparkContext.scala
index 998ee48186..378139593b 100644
--- a/mllib/src/test/scala/org/apache/spark/mllib/util/MLlibTestSparkContext.scala
+++ b/mllib/src/test/scala/org/apache/spark/mllib/util/MLlibTestSparkContext.scala
@@ -34,6 +34,7 @@ trait MLlibTestSparkContext extends BeforeAndAfterAll { self: Suite =>
     sc = new SparkContext(conf)
     SQLContext.clearActive()
     sqlContext = new SQLContext(sc)
+    SQLContext.setActive(sqlContext)
   }
 
   override def afterAll() {