path: root/docs/sql-programming-guide.md
author    Daoyuan Wang <daoyuan.wang@intel.com>    2016-03-16 22:52:10 -0700
committer Reynold Xin <rxin@databricks.com>    2016-03-16 22:52:10 -0700
commit d1c193a2f1a5e2b98f5df1b86d7a7ec0ced13668 (patch)
tree   94aef4269107c7dbb650980116c9bd2ca72566ba /docs/sql-programming-guide.md
parent c890c359b1dfb64274d1d0067b1e16d834035f11 (diff)
[SPARK-12855][MINOR][SQL][DOC][TEST] remove spark.sql.dialect from doc and test
## What changes were proposed in this pull request?

Since the developer API for a pluggable parser was removed in #10801, the docs should be updated accordingly.

## How was this patch tested?

This patch does not affect the real code path.

Author: Daoyuan Wang <daoyuan.wang@intel.com>

Closes #11758 from adrian-wang/spark12855.
Diffstat (limited to 'docs/sql-programming-guide.md')
-rw-r--r-- docs/sql-programming-guide.md | 7 -------
1 file changed, 0 insertions(+), 7 deletions(-)
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index 3138fd5fb4..2fdc97f8a0 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -122,13 +122,6 @@ Spark build. If these dependencies are not a problem for your application then u
 is recommended for the 1.3 release of Spark. Future releases will focus on bringing `SQLContext` up
 to feature parity with a `HiveContext`.
-The specific variant of SQL that is used to parse queries can also be selected using the
-`spark.sql.dialect` option. This parameter can be changed using either the `setConf` method on
-a `SQLContext` or by using a `SET key=value` command in SQL. For a `SQLContext`, the only dialect
-available is "sql" which uses a simple SQL parser provided by Spark SQL. In a `HiveContext`, the
-default is "hiveql", though "sql" is also available. Since the HiveQL parser is much more complete,
-this is recommended for most use cases.
-
 ## Creating DataFrames
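For reference, the paragraph removed above described two ways of selecting the SQL dialect before this option was dropped. A minimal sketch of both forms against the pre-removal Spark 1.x API, assuming `sqlContext` is an already-constructed `HiveContext` (not a definitive usage guide, since the setting no longer exists):

```scala
// Sketch of the Spark 1.x behavior the removed paragraph documented;
// assumes `sqlContext` is an existing org.apache.spark.sql.hive.HiveContext.

// Switch the parser via the setConf method...
sqlContext.setConf("spark.sql.dialect", "sql")

// ...or via a SET key=value command issued in SQL.
sqlContext.sql("SET spark.sql.dialect=hiveql")
```

On a plain `SQLContext` only the "sql" dialect was available; a `HiveContext` defaulted to "hiveql", which the old docs recommended as the more complete parser.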