author      gatorsmile <gatorsmile@gmail.com>    2016-05-23 21:07:14 -0700
committer   Reynold Xin <rxin@databricks.com>    2016-05-23 21:07:14 -0700
commit      d207716451f722c899b3845ee454f1e16c094125
tree        c51a2bcdefe80b53be084b3c38d2546866a9f35c
parent      a15ca5533db91fefaf3248255a59c4d94eeda1a9
[SPARK-15485][SQL][DOCS] Spark SQL Configuration
#### What changes were proposed in this pull request?

So far, the Configuration page in the official documentation does not have a section for Spark SQL: http://spark.apache.org/docs/latest/configuration.html

For Spark users, the names and default values of these public configuration parameters are very useful. This PR adds the missing section to configuration.html.

rxin yhuai marmbrus

#### How was this patch tested?

Below is the generated webpage.

<img width="924" alt="screenshot 2016-05-23 11 35 57" src="https://cloud.githubusercontent.com/assets/11567269/15480492/b08fefc4-20da-11e6-9fa2-7cd5b699ed35.png">
<img width="914" alt="screenshot 2016-05-23 11 37 38" src="https://cloud.githubusercontent.com/assets/11567269/15480499/c5f9482e-20da-11e6-95ff-10821add1af4.png">
<img width="923" alt="screenshot 2016-05-23 11 36 11" src="https://cloud.githubusercontent.com/assets/11567269/15480506/cbd81644-20da-11e6-9d27-effb716b2fac.png">
<img width="920" alt="screenshot 2016-05-23 11 36 18" src="https://cloud.githubusercontent.com/assets/11567269/15480511/d013e332-20da-11e6-854a-cf8813c46f36.png">

Author: gatorsmile <gatorsmile@gmail.com>

Closes #13263 from gatorsmile/configurationSQL.
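For reference, a minimal, self-contained Scala sketch (not part of the patch itself) of the command the new section documents, together with reading and overriding a single SQL property through the runtime config; the property name `spark.sql.shuffle.partitions` is only an illustrative example.

{% highlight scala %}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SqlConfigurationExample")
  .master("local[*]")
  .getOrCreate()

// List every SQL configuration property with its current/default value and description.
spark.sql("SET -v").show(numRows = 200, truncate = false)

// Read and override an individual SQL property through the runtime config;
// spark.sql.shuffle.partitions is only an illustrative example.
println(spark.conf.get("spark.sql.shuffle.partitions"))
spark.conf.set("spark.sql.shuffle.partitions", "100")

spark.stop()
{% endhighlight %}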
-rw-r--r--    docs/configuration.md    42
1 file changed, 42 insertions(+), 0 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index d23f0fe1a1..d6471a8cc7 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -1480,6 +1480,48 @@ Apart from these, the following properties are also available, and may be useful
</table>
+#### Spark SQL
+Running the <code>SET -v</code> command will show the entire list of SQL configuration properties.
+
+<div class="codetabs">
+<div data-lang="scala" markdown="1">
+
+{% highlight scala %}
+// spark is an existing SparkSession
+spark.sql("SET -v").show(numRows = 200, truncate = false)
+{% endhighlight %}
+
+</div>
+
+<div data-lang="java" markdown="1">
+
+{% highlight java %}
+// spark is an existing SparkSession
+spark.sql("SET -v").show(200, false);
+{% endhighlight %}
+</div>
+
+<div data-lang="python" markdown="1">
+
+{% highlight python %}
+# spark is an existing SparkSession
+spark.sql("SET -v").show(n=200, truncate=False)
+{% endhighlight %}
+
+</div>
+
+<div data-lang="r" markdown="1">
+
+{% highlight r %}
+# sqlContext is an existing SQLContext
+properties <- sql(sqlContext, "SET -v")
+showDF(properties, numRows = 200, truncate = FALSE)
+{% endhighlight %}
+
+</div>
+</div>
+
+
#### Spark Streaming
<table class="table">
<tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>