author    Reynold Xin <rxin@databricks.com>  2016-05-26 13:03:07 -0700
committer Andrew Or <andrew@databricks.com>  2016-05-26 13:03:07 -0700
commit    0f61d6efb45b9ee94fa663f67c4489fbdae2eded
tree      e54986e0a5f3671d50b7aef91f2f28efc4846c3f /sql/hive-thriftserver
parent    594a1bf200fea8d6bcf25839a49186f66f922bc8
[SPARK-15552][SQL] Remove unnecessary private[sql] methods in SparkSession
## What changes were proposed in this pull request?
SparkSession has a number of unnecessary private[sql] methods. These methods cause trouble because private[sql] does not apply in Java: Scala's qualified access modifiers have no JVM equivalent, so the methods are emitted as public bytecode and remain callable from Java. This patch removes the ones that are easy to remove.
As part of this pull request, I also replaced a number of protected[sql] modifiers with private[sql] to tighten up visibility.
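The visibility problem described above can be sketched as follows. This is a hypothetical, minimal example (the names `Session`, `internalHelper`, and `Demo` are illustrative, not Spark's real API): Scala's qualified modifier `private[sql]` limits callers to the `org.apache.spark.sql` package at compile time, but the JVM has no matching access level, so the method is compiled as public and stays reachable from Java code.

```scala
// Hypothetical sketch of a private[sql] method; not taken from the patch.
package org.apache.spark.sql

class Session {
  // Scala sources outside the sql package cannot call this at compile
  // time, but the emitted bytecode is public, so Java callers can.
  private[sql] def internalHelper(): String = "internal"
}

object Demo {
  // Inside the sql package (or from Java, via the public bytecode),
  // the call compiles and runs normally.
  def run(): String = new Session().internalHelper()
}
```

Running `org.apache.spark.sql.Demo.run()` returns `"internal"`; inspecting `Session` with `javap` would show `internalHelper` as a plain public method, which is why removing such methods (rather than relying on the qualifier) is the safer cleanup.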
## How was this patch tested?
Updated test cases to reflect the changes.
Author: Reynold Xin <rxin@databricks.com>
Closes #13319 from rxin/SPARK-15552.
Diffstat (limited to 'sql/hive-thriftserver')
-rw-r--r-- | sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLDriver.scala | 2 |
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLDriver.scala b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLDriver.scala
index c24e474d9c..0d5dc7af5f 100644
--- a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLDriver.scala
+++ b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLDriver.scala
@@ -59,7 +59,7 @@ private[hive] class SparkSQLDriver(val context: SQLContext = SparkSQLEnv.sqlCont
     // TODO unify the error code
     try {
       context.sparkContext.setJobDescription(command)
-      val execution = context.executePlan(context.sql(command).logicalPlan)
+      val execution = context.sessionState.executePlan(context.sql(command).logicalPlan)
       hiveResponse = execution.hiveResultString()
       tableSchema = getResultSetSchema(execution)
       new CommandProcessorResponse(0)