| author | gatorsmile <gatorsmile@gmail.com> | 2016-05-21 23:12:27 -0700 |
|---|---|---|
| committer | Reynold Xin <rxin@databricks.com> | 2016-05-21 23:12:27 -0700 |
| commit | 6cb8f836da197eec17d33e4a547340c15e59d091 (patch) | |
| tree | 69dbd6460c25de8bd473da29629d82f4ff1b50e2 /examples/src/main/scala/org | |
| parent | 223f6339088434eb3590c2f42091a38f05f1e5db (diff) | |
[SPARK-15396][SQL][DOC] It can't connect hive metastore database
#### What changes were proposed in this pull request?
The `hive.metastore.warehouse.dir` property in hive-site.xml has been deprecated since Spark 2.0.0. Users might not be able to connect to their existing metastore if they do not use the new configuration parameter `spark.sql.warehouse.dir`.
This PR updates the documentation and examples to explain the latest changes in how the default database location is configured.
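For readers migrating from hive-site.xml, setting the new parameter explicitly might look like the following sketch. This is illustrative only, not part of this patch: the warehouse path and object name are hypothetical, and the key behavior (that `spark.sql.warehouse.dir` supersedes `hive.metastore.warehouse.dir` and defaults to `spark-warehouse` in the startup directory) is from the change described above.

```scala
import org.apache.spark.sql.SparkSession

object WarehouseDirExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      .appName("WarehouseDirExample")
      // Since Spark 2.0.0, spark.sql.warehouse.dir supersedes the deprecated
      // hive.metastore.warehouse.dir from hive-site.xml. If left unset, it
      // defaults to `spark-warehouse` under the directory where the Spark
      // application is started. The path below is a hypothetical example.
      .config("spark.sql.warehouse.dir", "/user/hive/warehouse")
      .enableHiveSupport()
      .getOrCreate()

    // Queries now resolve tables against the warehouse location set above.
    spark.sql("SHOW TABLES").show()

    spark.stop()
  }
}
```

Users pointing `spark.sql.warehouse.dir` at their existing Hive warehouse location should be able to see their previously created databases and tables.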
Below is the screenshot of the latest generated docs:
<img width="681" alt="screenshot 2016-05-20 08 38 10" src="https://cloud.githubusercontent.com/assets/11567269/15433296/a05c4ace-1e66-11e6-8d2b-73682b32e9c2.png">
<img width="789" alt="screenshot 2016-05-20 08 53 26" src="https://cloud.githubusercontent.com/assets/11567269/15433734/645dc42e-1e68-11e6-9476-effc9f8721bb.png">
<img width="789" alt="screenshot 2016-05-20 08 53 37" src="https://cloud.githubusercontent.com/assets/11567269/15433738/68569f92-1e68-11e6-83d3-ef5bb221a8d8.png">
No change is made to the R example.
<img width="860" alt="screenshot 2016-05-20 08 54 38" src="https://cloud.githubusercontent.com/assets/11567269/15433779/965b8312-1e68-11e6-8bc4-53c88ceacde2.png">
#### How was this patch tested?
N/A
Author: gatorsmile <gatorsmile@gmail.com>
Closes #13225 from gatorsmile/document.
Diffstat (limited to 'examples/src/main/scala/org')
-rw-r--r-- | examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala | 11 |
1 file changed, 7 insertions, 4 deletions
diff --git a/examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala b/examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala
index 59bdfa09ad..d3bb7e4398 100644
--- a/examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala
+++ b/examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala
@@ -37,10 +37,13 @@ object HiveFromSpark {
   def main(args: Array[String]) {
     val sparkConf = new SparkConf().setAppName("HiveFromSpark")
 
-    // A hive context adds support for finding tables in the MetaStore and writing queries
-    // using HiveQL. Users who do not have an existing Hive deployment can still create a
-    // HiveContext. When not configured by the hive-site.xml, the context automatically
-    // creates metastore_db and warehouse in the current directory.
+    // When working with Hive, one must instantiate `SparkSession` with Hive support, including
+    // connectivity to a persistent Hive metastore, support for Hive serdes, and Hive user-defined
+    // functions. Users who do not have an existing Hive deployment can still enable Hive support.
+    // When not configured by hive-site.xml, the context automatically creates `metastore_db`
+    // in the current directory and creates a directory configured by `spark.sql.warehouse.dir`,
+    // which defaults to the directory `spark-warehouse` under the directory where the Spark
+    // application is started.
     val spark = SparkSession.builder
       .config(sparkConf)
      .enableHiveSupport()