- author: Andrew Or <andrew@databricks.com> (2016-04-25 13:23:05 -0700)
- committer: Andrew Or <andrew@databricks.com> (2016-04-25 13:23:05 -0700)
- commit: 3c5e65c339a9b4d5e01375d7f073e444898d34c8
- tree: 039f7e382124f03495e9b22cdc00df7791affeb7 /sql/core/src
- parent: 6bfe42a3be4fbf8bc6f93a4709038fda8ad0610b
# [SPARK-14721][SQL] Remove HiveContext (part 2)
## What changes were proposed in this pull request?
This removes the class `HiveContext` itself, along with all code that uses it. The bulk of the work was already done in #12485; this patch is mainly code cleanup plus the actual removal of the class. (A short migration sketch follows the note below.)
Note: A couple of things will break after this patch. These will be fixed separately.
- the Python `HiveContext`
- all the documentation / comments referencing HiveContext
- there will be no more HiveContext in the REPL (fixed by #12589)
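As a rough illustration, here is a minimal migration sketch in Scala. It assumes a call site that previously constructed a `HiveContext` from an existing `SparkContext`; the replacement uses the `SparkSession.withHiveSupport` helper added in this patch (see the diff below). The wrapper object and method name are illustrative only, not part of the patch.

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession

// Hypothetical wrapper; only SparkSession.withHiveSupport comes from this patch.
object HiveMigration {
  def hiveSession(sc: SparkContext): SparkSession = {
    // Before this patch a call site would typically have written:
    //   val hive = new org.apache.spark.sql.hive.HiveContext(sc)
    // After this patch the same SparkContext yields a Hive-enabled SparkSession:
    SparkSession.withHiveSupport(sc)
  }
}
```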
## How was this patch tested?
No change in functionality.
Author: Andrew Or <andrew@databricks.com>
Closes #12585 from andrewor14/delete-hive-context.
Diffstat (limited to 'sql/core/src')
-rw-r--r-- sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala | 8 +++++++-
1 file changed, 7 insertions(+), 1 deletion(-)
```diff
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
index 5c8742d1d8..131f28f98b 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/SparkSession.scala
@@ -905,7 +905,7 @@ class SparkSession private(
 }
 
 
-private object SparkSession {
+object SparkSession {
 
   private def sharedStateClassName(conf: SparkConf): String = {
     conf.get(CATALOG_IMPLEMENTATION) match {
@@ -938,4 +938,10 @@ private object SparkSession {
     }
   }
 
+  // TODO: do we want to expose this?
+  def withHiveSupport(sc: SparkContext): SparkSession = {
+    sc.conf.set(CATALOG_IMPLEMENTATION.key, "hive")
+    new SparkSession(sc)
+  }
+
 }
```
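For context, here is a sketch of how the new helper might be used from application code, assuming a standalone local run; the object name, app name, master URL, and query are illustrative and not part of the patch.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

object WithHiveSupportExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("with-hive-support-example").setMaster("local[*]"))

    // Per the diff above, the helper sets CATALOG_IMPLEMENTATION to "hive" on the
    // SparkContext's conf and wraps the context in a SparkSession, since the
    // SparkSession constructor itself stays private.
    val spark = SparkSession.withHiveSupport(sc)

    spark.sql("SHOW TABLES").show()
    sc.stop()
  }
}
```

Note that the TODO in the diff ("do we want to expose this?") leaves open whether `withHiveSupport` will remain public, so this usage should be treated as provisional.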