author    Yin Huai <yhuai@databricks.com>    2016-06-28 07:54:44 -0700
committer Davies Liu <davies.liu@gmail.com>  2016-06-28 07:54:44 -0700
commit    0923c4f5676691e28e70ecb05890e123540b91f0 (patch)
tree      fc4b3fff3d5ab0f07080ec90cabeae9786bae147 /python/pyspark/tests.py
parent    e158478a9fff5e63ae0336a54b3f360d0cd38921 (diff)
[SPARK-16224] [SQL] [PYSPARK] SparkSession builder's configs need to be set to the existing Scala SparkContext's SparkConf
## What changes were proposed in this pull request?

When we create a SparkSession on the Python side, it is possible that a SparkContext has already been created. In that case, we need to set the SparkSession builder's configs on the Scala SparkContext's SparkConf (we have to do so because conf changes made on an active Python SparkContext are not propagated to the JVM side). Otherwise, we may create a wrong SparkSession, e.g. one where Hive support is not enabled even though enableHiveSupport was called.

## How was this patch tested?

New tests and manual tests.

Author: Yin Huai <yhuai@databricks.com>

Closes #13931 from yhuai/SPARK-16224.
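To make the scenario concrete, here is a minimal sketch of the builder path the fix covers. It assumes a patched PySpark running locally; the config key reuses the one from the new test below, while the master URL and app name are illustrative, not part of the commit:

```python
from pyspark import SparkContext
from pyspark.sql import SparkSession

# A SparkContext already exists before any SparkSession is built.
sc = SparkContext("local[2]", "spark-16224-sketch")

# The builder's options must reach the existing Scala SparkContext's
# SparkConf; without the fix they may never arrive on the JVM side.
spark = SparkSession.builder \
    .config("spark.test.SPARK16224", "SPARK16224") \
    .getOrCreate()

# With the fix, the option is visible on the JVM-side SparkConf.
print(spark.sparkContext._jsc.sc().conf().get("spark.test.SPARK16224"))

spark.stop()
```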
Diffstat (limited to 'python/pyspark/tests.py')
-rw-r--r--  python/pyspark/tests.py | 8 ++++++++
1 file changed, 8 insertions(+), 0 deletions(-)
diff --git a/python/pyspark/tests.py b/python/pyspark/tests.py
index 222c5ca5f4..0a029b6e74 100644
--- a/python/pyspark/tests.py
+++ b/python/pyspark/tests.py
@@ -1921,6 +1921,14 @@ class ContextTests(unittest.TestCase):
         post_parallalize_temp_files = os.listdir(sc._temp_dir)
         self.assertEqual(temp_files, post_parallalize_temp_files)
 
+    def test_set_conf(self):
+        # This is for an internal use case. When there is an existing SparkContext,
+        # SparkSession's builder needs to set configs into SparkContext's conf.
+        sc = SparkContext()
+        sc._conf.set("spark.test.SPARK16224", "SPARK16224")
+        self.assertEqual(sc._jsc.sc().conf().get("spark.test.SPARK16224"), "SPARK16224")
+        sc.stop()
+
     def test_stop(self):
         sc = SparkContext()
         self.assertNotEqual(SparkContext._active_spark_context, None)
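The new test exercises the propagation directly on `sc._conf` rather than through the builder: with a live JVM, PySpark's `SparkConf.set` delegates to a Java `SparkConf` object over Py4J, so the value written from Python is the same one the Scala side reads back. A standalone sketch of that round trip, assuming a local PySpark install; the key `spark.test.demo`, the master URL, and the app name are illustrative:

```python
from pyspark import SparkContext

sc = SparkContext("local[1]", "conf-roundtrip-sketch")
try:
    # With the JVM running, this set() lands on the Java SparkConf,
    # not on a Python-only dictionary copy.
    sc._conf.set("spark.test.demo", "demo-value")

    # Reading back through the Java gateway sees the same value,
    # which is exactly what test_set_conf asserts above.
    assert sc._jsc.sc().conf().get("spark.test.demo") == "demo-value"
finally:
    sc.stop()
```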