author    Jeff Zhang <zjffdu@apache.org>  2016-05-25 10:46:51 -0700
committer Andrew Or <andrew@databricks.com>  2016-05-25 10:46:51 -0700
commit    01e7b9c85bb84924e279021f9748774dce9702c8 (patch)
tree      8c360803e35f42b64c9a98f3baa7eaad4a4f5eb1 /core/src
parent    b120fba6ae26186b3fa0dfbb1637046f4e76c2b0 (diff)
[SPARK-15345][SQL][PYSPARK] SparkSession's conf doesn't take effect when there is already an existing SparkContext

## What changes were proposed in this pull request?

Override the existing SparkContext if the provided SparkConf is different. The PySpark part hasn't been fixed yet; I will do that after the first round of review to ensure this is the correct approach.

## How was this patch tested?

Manually verified in spark-shell.

rxin Please help review it; I think this is a very critical issue for Spark 2.0.

Author: Jeff Zhang <zjffdu@apache.org>

Closes #13160 from zjffdu/SPARK-15345.
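The situation this patch warns about can be sketched with a toy mock of the getOrCreate singleton pattern (this is a hypothetical simplification in Python, not Spark's actual Scala code; here the warning fires only when an existing context is reused, which is the intent of the patch):

```python
import warnings

class MockSparkContext:
    """Toy stand-in for SparkContext (hypothetical, not Spark's real API)."""
    _active = None  # process-wide singleton, like SparkContext's activeContext

    def __init__(self, conf):
        self.conf = dict(conf)

def get_or_create(conf):
    """Mimic getOrCreate: create a context once, then always reuse it."""
    if MockSparkContext._active is None:
        MockSparkContext._active = MockSparkContext(conf)
    elif conf:
        # This is the gap SPARK-15345 surfaces: a non-empty conf passed
        # alongside an already-active context is silently ignored, so the
        # patch adds a warning instead of failing quietly.
        warnings.warn(
            "Use an existing SparkContext, some configuration may not take effect.")
    return MockSparkContext._active

first = get_or_create({"spark.app.name": "first"})
second = get_or_create({"spark.app.name": "second"})  # warns; conf is ignored
assert second is first
assert second.conf["spark.app.name"] == "first"
```

The second caller's configuration never reaches the live context, which is exactly why a user constructing a SparkSession with new settings on top of an existing SparkContext sees their conf have no effect.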
Diffstat (limited to 'core/src')
-rw-r--r--  core/src/main/scala/org/apache/spark/SparkContext.scala | 3
1 file changed, 3 insertions, 0 deletions
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 36aa3becb4..5018eb38d9 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -2254,6 +2254,9 @@ object SparkContext extends Logging {
       if (activeContext.get() == null) {
         setActiveContext(new SparkContext(config), allowMultipleContexts = false)
       }
+      if (config.getAll.nonEmpty) {
+        logWarning("Use an existing SparkContext, some configuration may not take effect.")
+      }
       activeContext.get()
     }
   }