author: Cheng Hao <hao.cheng@intel.com> (2016-07-05 16:42:43 -0700)
committer: Reynold Xin <rxin@databricks.com> (2016-07-05 16:42:43 -0700)
commit: 920cb5fe4ed0eb008cd14bf0ea45ed5b225b5022
tree: 871f5f0b6631c23aa2599a0f35ef2a911cee3d02 /sql/hive-thriftserver/src/main
parent: 5b7a1770ac9cf36a1e92b31d10fe6eeee92fef17
[SPARK-15730][SQL] Respect the --hiveconf in the spark-sql command line
## What changes were proposed in this pull request?
This PR makes spark-sql (backed by SparkSQLCLIDriver) respect configurations set through --hiveconf, as it did in previous versions. When SparkSQLCLIDriver starts, it now explicitly copies the configurations set through --hiveconf into SQLContext's conf (effectively treating them as Spark SQL configurations).
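The copy step can be sketched outside Spark with plain Java maps. Everything here is a hypothetical stand-in for illustration: `overridden` plays the role of `sessionState.getOverriddenConfigurations()`, `sqlConf` the role of SQLContext's conf, and `HiveconfCopy`/`copyOverridden` are not Spark APIs. The entry-set iteration mirrors the Scala loop added by this patch:

```java
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

public class HiveconfCopy {
    // Hypothetical helper: copy every --hiveconf override into the SQL conf,
    // using the same entry-set iteration pattern as the patch.
    static void copyOverridden(Map<String, String> overridden,
                               Map<String, String> sqlConf) {
        Iterator<Map.Entry<String, String>> it = overridden.entrySet().iterator();
        while (it.hasNext()) {
            Map.Entry<String, String> kv = it.next();
            sqlConf.put(kv.getKey(), kv.getValue());
        }
    }

    public static void main(String[] args) {
        // Simulate entries parsed from "--hiveconf key=value" on the command line.
        Map<String, String> overridden = new HashMap<>();
        overridden.put("hive.exec.dynamic.partition.mode", "nonstrict");

        Map<String, String> sqlConf = new HashMap<>();
        copyOverridden(overridden, sqlConf);
        System.out.println(sqlConf.get("hive.exec.dynamic.partition.mode"));
    }
}
```

Before this patch, the overrides stayed in the Hive session state and Spark SQL never saw them; the fix is simply performing this copy at CLI startup.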
## How was this patch tested?
A new test in CliSuite.
Closes #13542
Author: Cheng Hao <hao.cheng@intel.com>
Author: Yin Huai <yhuai@databricks.com>
Closes #14058 from yhuai/hiveConfThriftServer.
Diffstat (limited to 'sql/hive-thriftserver/src/main')
-rw-r--r-- | sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala | 8 |
1 file changed, 8 insertions, 0 deletions
diff --git a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
index 7389e18aef..5dafec1c30 100644
--- a/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
+++ b/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala
@@ -156,6 +156,14 @@ private[hive] object SparkSQLCLIDriver extends Logging {
     // Execute -i init files (always in silent mode)
     cli.processInitFiles(sessionState)
 
+    // Respect the configurations set by --hiveconf from the command line
+    // (based on Hive's CliDriver).
+    val it = sessionState.getOverriddenConfigurations.entrySet().iterator()
+    while (it.hasNext) {
+      val kv = it.next()
+      SparkSQLEnv.sqlContext.setConf(kv.getKey, kv.getValue)
+    }
+
     if (sessionState.execString != null) {
       System.exit(cli.processLine(sessionState.execString))
     }