author     Wenchen Fan <wenchen@databricks.com>    2016-10-11 20:27:08 -0700
committer  gatorsmile <gatorsmile@gmail.com>       2016-10-11 20:27:08 -0700
commit     b9a147181d5e38d9abed0c7215f4c5cb695f579c (patch)
tree       e2c3bda2b680a67b914b2fc737bbafb0350efb93 /R/pkg/inst/tests/testthat
parent     5b77e66dd6a128c5992ab3bde418613f84be7009 (diff)
[SPARK-17720][SQL] introduce static SQL conf
## What changes were proposed in this pull request?

SQLConf is session-scoped and mutable. However, we also need a static SQL conf, which is global and immutable, e.g. the `schemaStringThreshold` in `HiveExternalCatalog`, the flag to enable/disable Hive support, and the global temp view database in https://github.com/apache/spark/pull/14897. We have already implemented static SQL confs implicitly via `SparkConf`; this PR just makes them explicit and exposes them to users, so that they can see the config values via a SQL command or `SparkSession.conf`, and it forbids users from setting/unsetting static SQL confs.

## How was this patch tested?

New tests in SQLConfSuite.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #15295 from cloud-fan/global-conf.
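For context, a minimal SparkR sketch of the behavior described above, in the `callJMethod` style used by the diffed test file. This is an illustration, not code from the PR; the exact error raised on write is an assumption.

```r
library(SparkR)

# Static SQL confs are fixed once the session exists.
sparkSession <- sparkR.session()
conf <- callJMethod(sparkSession, "conf")

# Static confs are readable through the session conf like any other conf...
callJMethod(conf, "get", "spark.sql.catalogImplementation")  # "hive" or "in-memory"

# ...but writes should be rejected, since static confs are global and
# immutable. (Assumed failure mode; tryCatch keeps the sketch runnable.)
tryCatch(
  callJMethod(conf, "set", "spark.sql.catalogImplementation", "in-memory"),
  error = function(e) message("rejected as expected: ", conditionMessage(e))
)
```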
Diffstat (limited to 'R/pkg/inst/tests/testthat')
-rw-r--r--  R/pkg/inst/tests/testthat/test_sparkSQL.R  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/R/pkg/inst/tests/testthat/test_sparkSQL.R b/R/pkg/inst/tests/testthat/test_sparkSQL.R
index 6d8cfad5c1..61554248ee 100644
--- a/R/pkg/inst/tests/testthat/test_sparkSQL.R
+++ b/R/pkg/inst/tests/testthat/test_sparkSQL.R
@@ -2609,7 +2609,7 @@ test_that("enableHiveSupport on SparkSession", {
unsetHiveContext()
# if we are still here, it must be built with hive
conf <- callJMethod(sparkSession, "conf")
- value <- callJMethod(conf, "get", "spark.sql.catalogImplementation", "")
+ value <- callJMethod(conf, "get", "spark.sql.catalogImplementation")
expect_equal(value, "hive")
})
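The only change is dropping the `""` default from the `get` call. For reference, the two-argument `RuntimeConfig.get` returns the supplied default when the key has no value, while the one-argument form throws, so the strict form fails loudly rather than letting `expect_equal` compare `""` against `"hive"`. A sketch of both forms, assuming `sparkSession` as set up earlier in the test file:

```r
conf <- callJMethod(sparkSession, "conf")

# Two-argument get: returns the supplied default ("") if the key is unset.
lenient <- callJMethod(conf, "get", "spark.sql.catalogImplementation", "")

# One-argument get: throws if the key is unset, surfacing a missing
# static conf as a test error instead of a silent empty string.
strict <- callJMethod(conf, "get", "spark.sql.catalogImplementation")
```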