author     Xiangrui Meng <meng@databricks.com>    2015-11-13 13:09:28 -0800
committer  Xiangrui Meng <meng@databricks.com>    2015-11-13 13:09:28 -0800
commit     2d2411faa2dd1b7312c4277b2dd9e5678195cfbb (patch)
tree       512b18e49f95ef69316b91f8436a08dd6f30a7e3 /mllib/src/main
parent     d7b2b97ad67f9700fb8c13422c399f2edb72f770 (diff)
[SPARK-11672][ML] Set active SQLContext in MLlibTestSparkContext.beforeAll
Still saw some error messages caused by `SQLContext.getOrCreate`: https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/3997/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.3,label=spark-test/testReport/junit/org.apache.spark.ml.util/JavaDefaultReadWriteSuite/testDefaultReadWrite/

This PR sets the active SQLContext in `beforeAll`, which is not automatically set by `new SQLContext`. This makes `SQLContext.getOrCreate` return the right SQLContext.

cc: yhuai

Author: Xiangrui Meng <meng@databricks.com>

Closes #9694 from mengxr/SPARK-11672.3.
Diffstat (limited to 'mllib/src/main')
-rw-r--r--  mllib/src/main/scala/org/apache/spark/ml/util/ReadWrite.scala  7
1 file changed, 5 insertions(+), 2 deletions(-)
diff --git a/mllib/src/main/scala/org/apache/spark/ml/util/ReadWrite.scala b/mllib/src/main/scala/org/apache/spark/ml/util/ReadWrite.scala
index 85f888c9f2..ca896ed610 100644
--- a/mllib/src/main/scala/org/apache/spark/ml/util/ReadWrite.scala
+++ b/mllib/src/main/scala/org/apache/spark/ml/util/ReadWrite.scala
@@ -48,8 +48,11 @@ private[util] sealed trait BaseReadWrite {
/**
* Returns the user-specified SQL context or the default.
*/
- protected final def sqlContext: SQLContext = optionSQLContext.getOrElse {
- SQLContext.getOrCreate(SparkContext.getOrCreate())
+ protected final def sqlContext: SQLContext = {
+ if (optionSQLContext.isEmpty) {
+ optionSQLContext = Some(SQLContext.getOrCreate(SparkContext.getOrCreate()))
+ }
+ optionSQLContext.get
}
/** Returns the [[SparkContext]] underlying [[sqlContext]] */
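As the commit message notes, `new SQLContext` does not register itself as the active context, so a test harness must set it explicitly in `beforeAll` for `SQLContext.getOrCreate` to return the expected instance later. The diff above caches the context on the reader/writer side; the test-side pattern could be sketched as follows (hypothetical trait name; the real change lives in `MLlibTestSparkContext`, and the exact `setActive`/`clearActive` signatures are assumed from the Spark API of that era):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.scalatest.{BeforeAndAfterAll, Suite}

// Sketch of the beforeAll pattern from SPARK-11672: constructing a
// SQLContext with `new` does not make it the active one, so we register
// it explicitly to keep SQLContext.getOrCreate consistent across tests.
trait SketchTestSparkContext extends BeforeAndAfterAll { self: Suite =>
  @transient var sc: SparkContext = _
  @transient var sqlContext: SQLContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    val conf = new SparkConf().setMaster("local[2]").setAppName("SketchTest")
    sc = new SparkContext(conf)
    sqlContext = new SQLContext(sc)
    // Without this, a later SQLContext.getOrCreate may return a stale context.
    SQLContext.setActive(sqlContext)
  }

  override def afterAll(): Unit = {
    SQLContext.clearActive()
    if (sc != null) sc.stop()
    sc = null
    sqlContext = null
    super.afterAll()
  }
}
```

Clearing the active context in `afterAll` mirrors the setup and avoids leaking a stopped context into the next suite.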