author     hyukjinkwon <gurwls223@gmail.com>    2016-07-06 10:45:51 -0700
committer  Reynold Xin <rxin@databricks.com>    2016-07-06 10:45:51 -0700
commit     4e14199ff740ea186eb2cec2e5cf901b58c5f90e (patch)
tree       cfd7850c821e764c2243615a8fd8642d73323da1 /python/pyspark/sql/session.py
parent     b1310425b30cbd711e4834d65a0accb3c5a8403a (diff)
[MINOR][PYSPARK][DOC] Fix wrongly formatted examples in PySpark documentation
## What changes were proposed in this pull request?

This PR fixes wrongly formatted examples in the PySpark documentation, as below:

- **`SparkSession`**
  - **Before**

    ![2016-07-06 11 34 41](https://cloud.githubusercontent.com/assets/6477701/16605847/ae939526-436d-11e6-8ab8-6ad578362425.png)

  - **After**

    ![2016-07-06 11 33 56](https://cloud.githubusercontent.com/assets/6477701/16605845/ace9ee78-436d-11e6-8923-b76d4fc3e7c3.png)

- **`Builder`**
  - **Before**

    ![2016-07-06 11 34 44](https://cloud.githubusercontent.com/assets/6477701/16605844/aba60dbc-436d-11e6-990a-c87bc0281c6b.png)

  - **After**

    ![2016-07-06 1 26 37](https://cloud.githubusercontent.com/assets/6477701/16607562/586704c0-437d-11e6-9483-e0af93d8f74e.png)

This PR also fixes several similar instances across the documentation in the `sql` PySpark module.

## How was this patch tested?

N/A

Author: hyukjinkwon <gurwls223@gmail.com>

Closes #14063 from HyukjinKwon/minor-pyspark-builder.
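The core of the fix is that a bare backslash at the end of a line inside a docstring is consumed by Python's string parsing, and doctest needs an explicit `... ` prompt on each continued line. The following standalone sketch is not part of the patch (the `example` function is hypothetical); it only illustrates the corrected pattern with escaped backslashes and continuation prompts, and can be run with the standard `doctest` module:

def example():
    """A multi-line doctest written with the corrected formatting.

    Writing "\\" in the source leaves a literal backslash in the docstring,
    and each continued line carries a "... " prompt so doctest (and Sphinx)
    treat it as part of the same statement.

    >>> text = "Word" \\
    ...     " Count"
    >>> text
    'Word Count'
    """


if __name__ == "__main__":
    import doctest
    doctest.testmod(verbose=True)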
Diffstat (limited to 'python/pyspark/sql/session.py')
-rw-r--r--  python/pyspark/sql/session.py  13
1 file changed, 7 insertions(+), 6 deletions(-)
diff --git a/python/pyspark/sql/session.py b/python/pyspark/sql/session.py
index 55f86a16f5..a360fbefa4 100644
--- a/python/pyspark/sql/session.py
+++ b/python/pyspark/sql/session.py
@@ -66,12 +66,11 @@ class SparkSession(object):
tables, execute SQL over tables, cache tables, and read parquet files.
To create a SparkSession, use the following builder pattern:
- >>> spark = SparkSession.builder \
- .master("local") \
- .appName("Word Count") \
- .config("spark.some.config.option", "some-value") \
- .getOrCreate()
-
+ >>> spark = SparkSession.builder \\
+ ... .master("local") \\
+ ... .appName("Word Count") \\
+ ... .config("spark.some.config.option", "some-value") \\
+ ... .getOrCreate()
"""
class Builder(object):
@@ -87,11 +86,13 @@ class SparkSession(object):
both :class:`SparkConf` and :class:`SparkSession`'s own configuration.
For an existing SparkConf, use `conf` parameter.
+
>>> from pyspark.conf import SparkConf
>>> SparkSession.builder.config(conf=SparkConf())
<pyspark.sql.session...
For a (key, value) pair, you can omit parameter names.
+
>>> SparkSession.builder.config("spark.some.config.option", "some-value")
<pyspark.sql.session...
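For reference, the same builder pattern shown in the corrected docstring can be written as a plain script, where ordinary single backslashes are used because the code is no longer inside a docstring. This is a minimal sketch assuming a local PySpark installation; `spark.some.config.option` is just the placeholder key from the docstring, not a real Spark setting:

from pyspark.conf import SparkConf
from pyspark.sql import SparkSession

# Builder pattern from the corrected SparkSession docstring; in a script the
# backslashes are written singly, the "\\" escaping is only needed in docstrings.
spark = SparkSession.builder \
    .master("local") \
    .appName("Word Count") \
    .config("spark.some.config.option", "some-value") \
    .getOrCreate()

# config() also accepts an existing SparkConf, as the Builder.config docstring notes.
conf = SparkConf().set("spark.some.config.option", "some-value")
same_session = SparkSession.builder.config(conf=conf).getOrCreate()

print(spark.version)
spark.stop()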