author    Sandeep Singh <sandeep@techaddict.me>  2016-05-11 14:15:18 -0700
committer Davies Liu <davies.liu@gmail.com>      2016-05-11 14:15:18 -0700
commit    de9c85ccaacd12de9837eb88eae0a7e7ededd679 (patch)
tree      06d8e46cfd1483716efc5af27c6669a383ec608a /python/pyspark/sql/context.py
parent    40a949aae9c3040019a52482d091912a85b0f4d4 (diff)
[SPARK-15270] [SQL] Use SparkSession Builder to build a session with HiveSupport
## What changes were proposed in this pull request?

Before: creating a HiveContext failed:

```python
from pyspark.sql import HiveContext
hc = HiveContext(sc)
```

with

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "spark-2.0/python/pyspark/sql/context.py", line 458, in __init__
    sparkSession = SparkSession.withHiveSupport(sparkContext)
  File "spark-2.0/python/pyspark/sql/session.py", line 192, in withHiveSupport
    jsparkSession = sparkContext._jvm.SparkSession.withHiveSupport(sparkContext._jsc.sc())
  File "spark-2.0/python/lib/py4j-0.9.2-src.zip/py4j/java_gateway.py", line 1048, in __getattr__
py4j.protocol.Py4JError: org.apache.spark.sql.SparkSession.withHiveSupport does not exist in the JVM
```

Now:

```python
>>> from pyspark.sql import HiveContext
>>> hc = HiveContext(sc)
>>> hc.range(0, 100)
DataFrame[id: bigint]
>>> hc.range(0, 100).count()
100
```

## How was this patch tested?

Existing tests; tested manually in the Python shell.

Author: Sandeep Singh <sandeep@techaddict.me>

Closes #13056 from techaddict/SPARK-15270.
Diffstat (limited to 'python/pyspark/sql/context.py')
-rw-r--r--  python/pyspark/sql/context.py | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/python/pyspark/sql/context.py b/python/pyspark/sql/context.py
index 78ab2e81bf..02e742c124 100644
--- a/python/pyspark/sql/context.py
+++ b/python/pyspark/sql/context.py
@@ -455,7 +455,7 @@ class HiveContext(SQLContext):
     def __init__(self, sparkContext, jhiveContext=None):
         if jhiveContext is None:
-            sparkSession = SparkSession.withHiveSupport(sparkContext)
+            sparkSession = SparkSession.builder.enableHiveSupport().getOrCreate()
         else:
             sparkSession = SparkSession(sparkContext, jhiveContext.sparkSession())
         SQLContext.__init__(self, sparkContext, sparkSession, jhiveContext)
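The fix replaces the removed `SparkSession.withHiveSupport` call with the builder pattern, whose `getOrCreate()` reuses an already-active session instead of constructing a new one. As a rough illustration of that behavior, here is a minimal, self-contained sketch using toy classes (`ToySparkSession` and its `Builder` are hypothetical stand-ins, not the real pyspark API):

```python
class ToySparkSession:
    """Toy stand-in for SparkSession, illustrating only the builder pattern."""

    _active = None  # module-level singleton, like the active SparkSession

    def __init__(self, options):
        self.options = options

    class Builder:
        def __init__(self):
            self._options = {}

        def enableHiveSupport(self):
            # In real Spark this sets spark.sql.catalogImplementation=hive
            self._options["spark.sql.catalogImplementation"] = "hive"
            return self  # chainable, like the real builder

        def getOrCreate(self):
            # Reuse the active session if one exists; otherwise create one
            if ToySparkSession._active is None:
                ToySparkSession._active = ToySparkSession(dict(self._options))
            return ToySparkSession._active


# Usage: both calls resolve to the same session object
s1 = ToySparkSession.Builder().enableHiveSupport().getOrCreate()
s2 = ToySparkSession.Builder().getOrCreate()
print(s1 is s2)  # the second getOrCreate() reuses the first session
```

This reuse is exactly why the patched `HiveContext.__init__` can call `getOrCreate()` unconditionally: in an existing PySpark shell, it attaches to the session already backing `sc` rather than failing on a JVM method that no longer exists.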