author    Kay Ousterhout <kayousterhout@gmail.com>  2015-02-19 09:49:34 +0800
committer Cheng Lian <lian@databricks.com>  2015-02-19 09:49:34 +0800
commit    e945aa6139e022d13ac793f46819cfee07b782fc (patch)
tree      b425c45132c5129ede5b81849af629f8ccf883a2 /sql/hive-thriftserver/v0.13.1
parent    d12d2ad76ee673b819c92dd8093ba0a560847761 (diff)
[SPARK-5846] Correctly set job description and pool for SQL jobs
@marmbrus, am I missing something obvious here? I verified that this fixes the problem for me (on 1.2.1) on EC2, but I'm confused about how others wouldn't have noticed this?
Author: Kay Ousterhout <kayousterhout@gmail.com>
Closes #4630 from kayousterhout/SPARK-5846_1.3 and squashes the following commits:
2022ad4 [Kay Ousterhout] [SPARK-5846] Correctly set job description and pool for SQL jobs
Diffstat (limited to 'sql/hive-thriftserver/v0.13.1')
-rw-r--r--  sql/hive-thriftserver/v0.13.1/src/main/scala/org/apache/spark/sql/hive/thriftserver/Shim13.scala  8
1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/sql/hive-thriftserver/v0.13.1/src/main/scala/org/apache/spark/sql/hive/thriftserver/Shim13.scala b/sql/hive-thriftserver/v0.13.1/src/main/scala/org/apache/spark/sql/hive/thriftserver/Shim13.scala
index 71e3954b2c..9b8faeff94 100644
--- a/sql/hive-thriftserver/v0.13.1/src/main/scala/org/apache/spark/sql/hive/thriftserver/Shim13.scala
+++ b/sql/hive-thriftserver/v0.13.1/src/main/scala/org/apache/spark/sql/hive/thriftserver/Shim13.scala
@@ -156,6 +156,10 @@ private[hive] class SparkExecuteStatementOperation(
   def run(): Unit = {
     logInfo(s"Running query '$statement'")
     setState(OperationState.RUNNING)
+    hiveContext.sparkContext.setJobDescription(statement)
+    sessionToActivePool.get(parentSession.getSessionHandle).foreach { pool =>
+      hiveContext.sparkContext.setLocalProperty("spark.scheduler.pool", pool)
+    }
     try {
       result = hiveContext.sql(statement)
       logDebug(result.queryExecution.toString())
@@ -165,10 +169,6 @@ private[hive] class SparkExecuteStatementOperation(
         logInfo(s"Setting spark.scheduler.pool=$value for future statements in this session.")
       case _ =>
     }
-    hiveContext.sparkContext.setJobDescription(statement)
-    sessionToActivePool.get(parentSession.getSessionHandle).foreach { pool =>
-      hiveContext.sparkContext.setLocalProperty("spark.scheduler.pool", pool)
-    }
     iter = {
       val useIncrementalCollect =
         hiveContext.getConf("spark.sql.thriftServer.incrementalCollect", "false").toBoolean
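The fix relies on `SparkContext.setJobDescription` and `setLocalProperty` being thread-local settings that only affect jobs submitted *after* they are called. Since `hiveContext.sql(statement)` can itself trigger Spark jobs, setting the description and pool after that call (as the old code did) meant those jobs ran with no description and in the default pool. A minimal sketch of the pattern, outside the thrift-server context (the app name, pool name, and dummy job are hypothetical; the `setJobDescription`/`setLocalProperty` calls are real Spark APIs):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object PoolExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("pool-demo"))

    // Both properties are thread-local: they apply only to jobs
    // submitted from this thread after these calls.
    sc.setJobDescription("SELECT count(*) FROM t")          // shown in the web UI
    sc.setLocalProperty("spark.scheduler.pool", "fairPool") // fair-scheduler pool

    // Any job triggered from here on carries the description and pool;
    // a job submitted *before* the calls above would not.
    val n = sc.parallelize(1 to 100).count()
    println(s"counted $n rows")

    sc.stop()
  }
}
```

This also explains the placement in the patch: the properties are set immediately after `setState(OperationState.RUNNING)` and before the `try` block that runs the query, so every job the statement spawns inherits them.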