author    Reza Zadeh <rizlar@gmail.com> 2014-01-02 01:50:30 -0800
committer Reza Zadeh <rizlar@gmail.com> 2014-01-02 01:50:30 -0800
commit    61405785bc561b55681100fc3ef7e15ae8c4b113 (patch)
tree      6cbef0767dbf16942d90e335dbfef8019436378f /docs/python-programming-guide.md
parent    2612164f85ae3249c78c130fc51427ace33b3580 (diff)
parent    3713f8129a618a633a7aca8c944960c3e7ac9d3b (diff)
Merge remote-tracking branch 'upstream/master' into sparsesvd
Diffstat (limited to 'docs/python-programming-guide.md')
-rw-r--r-- docs/python-programming-guide.md | 15
1 file changed, 8 insertions(+), 7 deletions(-)
diff --git a/docs/python-programming-guide.md b/docs/python-programming-guide.md
index 55e39b1de1..96f93e24fe 100644
--- a/docs/python-programming-guide.md
+++ b/docs/python-programming-guide.md
@@ -131,15 +131,16 @@ sc = SparkContext("local", "App Name", pyFiles=['MyFile.py', 'lib.zip', 'app.egg
Files listed here will be added to the `PYTHONPATH` and shipped to remote worker machines.
Code dependencies can be added to an existing SparkContext using its `addPyFile()` method.
-You can set [system properties](configuration.html#system-properties)
-using `SparkContext.setSystemProperty()` class method *before*
-instantiating SparkContext. For example, to set the amount of memory
-per executor process:
+You can set [configuration properties](configuration.html#spark-properties) by passing a
+[SparkConf](api/pyspark/pyspark.conf.SparkConf-class.html) object to SparkContext:
{% highlight python %}
-from pyspark import SparkContext
-SparkContext.setSystemProperty('spark.executor.memory', '2g')
-sc = SparkContext("local", "App Name")
+from pyspark import SparkConf, SparkContext
+conf = (SparkConf()
+ .setMaster("local")
+ .setAppName("My app")
+ .set("spark.executor.memory", "1g"))
+sc = SparkContext(conf = conf)
{% endhighlight %}
# API Docs
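
The new snippet in this diff builds configuration through chained setter calls on a `SparkConf` object. The pattern works because each setter returns the object itself. A minimal stand-in sketch of that fluent-builder pattern is below; the `Conf` class here is hypothetical and is NOT PySpark's `SparkConf`, it only illustrates how chaining accumulates settings without a running Spark installation:

```python
# Hypothetical stand-in for a SparkConf-style fluent builder (illustration
# only, not part of PySpark). Each setter returns `self`, which is what
# allows the chained (Conf().set_master(...).set_app_name(...)) style.
class Conf:
    def __init__(self):
        self._settings = {}

    def set(self, key, value):
        self._settings[key] = value
        return self  # returning self enables method chaining

    def set_master(self, master):
        return self.set("spark.master", master)

    def set_app_name(self, name):
        return self.set("spark.app.name", name)

    def get(self, key):
        return self._settings.get(key)


conf = (Conf()
        .set_master("local")
        .set_app_name("My app")
        .set("spark.executor.memory", "1g"))
print(conf.get("spark.executor.memory"))  # 1g
```

With the real PySpark API, the same shape is what the diff adds: construct the conf first, then pass it once via `SparkContext(conf=conf)`, rather than mutating JVM system properties before instantiation as the old `setSystemProperty` approach did.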