| author | Ahir Reddy <ahirreddy@gmail.com> | 2014-02-20 21:20:39 -0800 |
|---|---|---|
| committer | Matei Zaharia <matei@databricks.com> | 2014-02-20 21:20:39 -0800 |
| commit | 59b1379594360636e97511982c794bcf36225e1a (patch) | |
| tree | ba453324cc601a6f388df2ec065b37479de7d06d /python/pyspark/conf.py | |
| parent | 3fede4831eeb7d36d4f8fa4aaa02ad0cc8b4b09e (diff) | |
SPARK-1114: Allow PySpark to use existing JVM and Gateway
Patch to allow PySpark to use existing JVM and Gateway. Changes to PySpark implementation of SparkConf to take existing SparkConf JVM handle. Change to PySpark SparkContext to allow subclass specific context initialization.
Author: Ahir Reddy <ahirreddy@gmail.com>
Closes #622 from ahirreddy/pyspark-existing-jvm and squashes the following commits:
a86f457 [Ahir Reddy] Patch to allow PySpark to use existing JVM and Gateway. Changes to PySpark implementation of SparkConf to take existing SparkConf JVM handle. Change to PySpark SparkContext to allow subclass specific context initialization.
Diffstat (limited to 'python/pyspark/conf.py')
-rw-r--r-- | python/pyspark/conf.py | 15 |
1 file changed, 10 insertions(+), 5 deletions(-)
diff --git a/python/pyspark/conf.py b/python/pyspark/conf.py
index 3870cd8f2b..49b68d57ab 100644
--- a/python/pyspark/conf.py
+++ b/python/pyspark/conf.py
@@ -75,7 +75,7 @@ class SparkConf(object):
     and can no longer be modified by the user.
     """
 
-    def __init__(self, loadDefaults=True, _jvm=None):
+    def __init__(self, loadDefaults=True, _jvm=None, _jconf=None):
         """
         Create a new Spark configuration.
 
@@ -83,11 +83,16 @@ class SparkConf(object):
           properties (True by default)
         @param _jvm: internal parameter used to pass a handle to the Java VM;
           does not need to be set by users
+        @param _jconf: Optionally pass in an existing SparkConf handle
+          to use its parameters
         """
-        from pyspark.context import SparkContext
-        SparkContext._ensure_initialized()
-        _jvm = _jvm or SparkContext._jvm
-        self._jconf = _jvm.SparkConf(loadDefaults)
+        if _jconf:
+            self._jconf = _jconf
+        else:
+            from pyspark.context import SparkContext
+            SparkContext._ensure_initialized()
+            _jvm = _jvm or SparkContext._jvm
+            self._jconf = _jvm.SparkConf(loadDefaults)
 
     def set(self, key, value):
         """Set a configuration property."""
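The effect of the new `_jconf` branch can be sketched without a running JVM. The following is a minimal, self-contained illustration: `FakeJVM` and `FakeJConf` are hypothetical stand-ins for the py4j gateway and the Java-side `SparkConf` handle (the real code calls `_jvm.SparkConf(loadDefaults)` over py4j), but the branching logic mirrors the patched constructor.

```python
class FakeJConf:
    """Stand-in for a JVM-side SparkConf handle (hypothetical)."""
    def __init__(self, load_defaults=True):
        self.load_defaults = load_defaults
        self.settings = {}

    def set(self, key, value):
        self.settings[key] = value


class FakeJVM:
    """Stand-in for the py4j JVM view; SparkConf is looked up on it."""
    SparkConf = FakeJConf


class SparkConf:
    def __init__(self, loadDefaults=True, _jvm=None, _jconf=None):
        if _jconf:
            # New code path: reuse an existing JVM-side conf handle.
            self._jconf = _jconf
        else:
            # Original path: create a fresh JVM-side SparkConf.
            jvm = _jvm or FakeJVM()
            self._jconf = jvm.SparkConf(loadDefaults)

    def set(self, key, value):
        self._jconf.set(key, value)
        return self


# Two Python wrappers built from the same handle share one underlying conf,
# which is what lets PySpark attach to an already-initialized JVM.
shared = FakeJConf()
conf_a = SparkConf(_jconf=shared)
conf_b = SparkConf(_jconf=shared)
conf_a.set("spark.app.name", "demo")
```

After `conf_a.set(...)`, the setting is visible through `conf_b` as well, since both wrap the same `shared` handle; constructing `SparkConf()` with no arguments still falls through to the original create-a-new-handle path.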