author     Boaz Mohar <boazmohar@gmail.com>    2017-02-25 11:32:09 -0800
committer  Xiao Li <gatorsmile@gmail.com>      2017-02-25 11:32:09 -0800
commit     061bcfb869fe5f64edd9ee2352fecd70665da317 (patch)
tree       84e9dfa741cad10ee7fbb378ce462f5fe5c92a5e
parent     8f0511ed49a353fb0745f320a84063ced5cc1857 (diff)
[MINOR][DOCS] Fixes two problems in the SQL programing guide page
## What changes were proposed in this pull request?
Removed duplicated lines in the SQL Python example and fixed a typo.
## How was this patch tested?
Searched for other typos in the page to minimize PRs.
Author: Boaz Mohar <boazmohar@gmail.com>
Closes #17066 from boazmohar/doc-fix.
-rw-r--r--  docs/sql-programming-guide.md          2
-rw-r--r--  examples/src/main/python/sql/basic.py  3
2 files changed, 1 insertion(+), 4 deletions(-)
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index 235f5ecc40..2dd1ab6ef3 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -1410,7 +1410,7 @@ Thrift JDBC server also supports sending thrift RPC messages over HTTP transport
 Use the following setting to enable HTTP mode as system property or in `hive-site.xml` file in `conf/`:
 
     hive.server2.transport.mode - Set this to value: http
-    hive.server2.thrift.http.port - HTTP port number fo listen on; default is 10001
+    hive.server2.thrift.http.port - HTTP port number to listen on; default is 10001
     hive.server2.http.endpoint - HTTP endpoint; default is cliservice
 
 To test, use beeline to connect to the JDBC/ODBC server in http mode with:
diff --git a/examples/src/main/python/sql/basic.py b/examples/src/main/python/sql/basic.py
index ebcf66995b..c07fa8f275 100644
--- a/examples/src/main/python/sql/basic.py
+++ b/examples/src/main/python/sql/basic.py
@@ -187,9 +187,6 @@ def programmatic_schema_example(spark):
     # Creates a temporary view using the DataFrame
     schemaPeople.createOrReplaceTempView("people")
 
-    # Creates a temporary view using the DataFrame
-    schemaPeople.createOrReplaceTempView("people")
-
     # SQL can be run over DataFrames that have been registered as a table.
     results = spark.sql("SELECT name FROM people")
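The three settings corrected in the docs hunk are HiveServer2 properties. As a sketch only (assuming the standard Hadoop/Hive XML configuration format; the exact file contents are not part of this commit), they could be placed in `conf/hive-site.xml` like this:

```xml
<!-- Sketch of conf/hive-site.xml entries enabling HTTP transport mode.
     Property names and values mirror those listed in the docs diff above;
     10001 and cliservice are the defaults the guide mentions. -->
<configuration>
  <property>
    <name>hive.server2.transport.mode</name>
    <value>http</value>
  </property>
  <property>
    <name>hive.server2.thrift.http.port</name>
    <value>10001</value>
  </property>
  <property>
    <name>hive.server2.http.endpoint</name>
    <value>cliservice</value>
  </property>
</configuration>
```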