author     Andrew Or <andrew@databricks.com>  2016-04-25 13:23:05 -0700
committer  Andrew Or <andrew@databricks.com>  2016-04-25 13:23:05 -0700
commit  3c5e65c339a9b4d5e01375d7f073e444898d34c8 (patch)
tree    039f7e382124f03495e9b22cdc00df7791affeb7 /dev
parent  6bfe42a3be4fbf8bc6f93a4709038fda8ad0610b (diff)
[SPARK-14721][SQL] Remove HiveContext (part 2)
## What changes were proposed in this pull request?

This removes the class `HiveContext` itself along with all code usages associated with it. The bulk of the work was already done in #12485. This is mainly just code cleanup and actually removing the class.

Note: a couple of things will break after this patch. These will be fixed separately.
- the Python `HiveContext`
- all the documentation / comments referencing `HiveContext`
- there will be no more `HiveContext` in the REPL (fixed by #12589)

## How was this patch tested?

No change in functionality.

Author: Andrew Or <andrew@databricks.com>

Closes #12585 from andrewor14/delete-hive-context.
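The migration this patch forces can be sketched as follows. Note that `SparkSession.withHiveSupport(sc)`, used in the diff below, was a short-lived API during the 2.0 development line; the released Spark 2.x equivalent (assumed here) is `SparkSession.builder().enableHiveSupport()`. This sketch needs a Spark runtime with Hive support on the classpath and is not runnable standalone.

```scala
// Migration sketch: replacing the removed HiveContext with SparkSession.
// Assumes a Spark 2.x runtime with Hive support available; not runnable standalone.
import org.apache.spark.sql.SparkSession

object HiveContextMigration {
  def main(args: Array[String]): Unit = {
    // Before (Spark 1.x):
    //   val sc = new SparkContext(new SparkConf().setAppName("Simple Sql App"))
    //   val hiveContext = new HiveContext(sc)
    //   import hiveContext._

    // After (Spark 2.x): a single entry point with Hive support enabled.
    val spark = SparkSession.builder()
      .appName("Simple Sql App")
      .enableHiveSupport()
      .getOrCreate()
    import spark.sql

    // Same SQL statements as in the audit-release app below.
    sql("DROP TABLE IF EXISTS src")
    sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    sql("LOAD DATA LOCAL INPATH 'data.txt' INTO TABLE src")

    spark.stop()
  }
}
```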
Diffstat (limited to 'dev')
-rw-r--r--  dev/audit-release/sbt_app_hive/src/main/scala/HiveApp.scala | 8
1 file changed, 3 insertions, 5 deletions
diff --git a/dev/audit-release/sbt_app_hive/src/main/scala/HiveApp.scala b/dev/audit-release/sbt_app_hive/src/main/scala/HiveApp.scala
index 4a980ec071..f69d46cd17 100644
--- a/dev/audit-release/sbt_app_hive/src/main/scala/HiveApp.scala
+++ b/dev/audit-release/sbt_app_hive/src/main/scala/HiveApp.scala
@@ -20,10 +20,8 @@ package main.scala
 import scala.collection.mutable.{ListBuffer, Queue}

-import org.apache.spark.SparkConf
-import org.apache.spark.SparkContext
+import org.apache.spark.{SparkConf, SparkContext, SparkSession}
 import org.apache.spark.rdd.RDD
-import org.apache.spark.sql.hive.HiveContext

 case class Person(name: String, age: Int)
@@ -35,9 +33,9 @@ object SparkSqlExample {
       case None => new SparkConf().setAppName("Simple Sql App")
     }
     val sc = new SparkContext(conf)
-    val hiveContext = new HiveContext(sc)
+    val sparkSession = SparkSession.withHiveSupport(sc)

-    import hiveContext._
+    import sparkSession._
     sql("DROP TABLE IF EXISTS src")
     sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
     sql("LOAD DATA LOCAL INPATH 'data.txt' INTO TABLE src")