author | Wenchen Fan <wenchen@databricks.com> | 2016-10-11 20:27:08 -0700
---|---|---
committer | gatorsmile <gatorsmile@gmail.com> | 2016-10-11 20:27:08 -0700
commit | b9a147181d5e38d9abed0c7215f4c5cb695f579c |
tree | e2c3bda2b680a67b914b2fc737bbafb0350efb93 |
parent | 5b77e66dd6a128c5992ab3bde418613f84be7009 |
[SPARK-17720][SQL] introduce static SQL conf
## What changes were proposed in this pull request?
`SQLConf` is session-scoped and mutable. However, we also need static SQL confs, which are global and immutable, e.g. the `schemaStringThreshold` in `HiveExternalCatalog`, the flag to enable/disable Hive support, and the global temp view database in https://github.com/apache/spark/pull/14897.
We have actually already implemented static SQL confs implicitly via `SparkConf`; this PR just makes them explicit and exposes them to users, so that they can see the config values via SQL commands or `SparkSession.conf`, and it forbids users from setting/unsetting static SQL confs.
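The distinction can be sketched as follows. This is an illustrative toy model of the session-scoped vs. static conf split described above, not Spark's actual `SQLConf`/`StaticSQLConf` implementation; the class and key names (other than `spark.sql.catalogImplementation`, which is a real static conf key) are made up for the example. Static confs share the read path with session confs but reject writes:

```scala
// Toy model: session confs are mutable per session, static confs are
// fixed at startup, readable through the same path, but never writable.
object ConfSketch {

  // Static confs: hard-coded here for the sketch; in Spark they would
  // come from SparkConf at startup and then never change.
  object StaticConf {
    private val settings = Map(
      "spark.sql.catalogImplementation" -> "hive")
    def keys: Set[String] = settings.keySet
    def get(key: String): Option[String] = settings.get(key)
  }

  class SessionConf {
    private val settings = scala.collection.mutable.Map[String, String]()

    // Setting a static conf is forbidden, mirroring the PR's behavior.
    def set(key: String, value: String): Unit = {
      require(!StaticConf.keys.contains(key),
        s"Cannot modify the value of a static config: $key")
      settings(key) = value
    }

    // Reads fall through to the static confs, so users can still
    // *see* static config values.
    def get(key: String): Option[String] =
      settings.get(key).orElse(StaticConf.get(key))
  }

  def main(args: Array[String]): Unit = {
    val conf = new SessionConf
    conf.set("spark.sql.shuffle.partitions", "10")       // session conf: OK
    println(conf.get("spark.sql.catalogImplementation")) // static conf: readable
    val rejected =
      try { conf.set("spark.sql.catalogImplementation", "in-memory"); false }
      catch { case _: IllegalArgumentException => true }
    println(s"static write rejected: $rejected")
  }
}
```

In real Spark (after this PR), attempting `spark.conf.set` on a static conf similarly fails with an error rather than silently having no effect.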
## How was this patch tested?
New tests in `SQLConfSuite`.
Author: Wenchen Fan <wenchen@databricks.com>
Closes #15295 from cloud-fan/global-conf.
Diffstat (limited to 'repl/scala-2.11/src/main')
-rw-r--r-- | repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala | 2
1 file changed, 1 insertion(+), 1 deletion(-)
```diff
diff --git a/repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala b/repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala
index 5dfe18ad49..fec4d49379 100644
--- a/repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala
+++ b/repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala
@@ -22,9 +22,9 @@ import java.io.File
 import scala.tools.nsc.GenericRunnerSettings

 import org.apache.spark._
-import org.apache.spark.internal.config.CATALOG_IMPLEMENTATION
 import org.apache.spark.internal.Logging
 import org.apache.spark.sql.SparkSession
+import org.apache.spark.sql.internal.StaticSQLConf.CATALOG_IMPLEMENTATION
 import org.apache.spark.util.Utils

 object Main extends Logging {
```