author    Sandy Ryza <sandy@cloudera.com>          2014-03-13 12:11:33 -0700
committer Patrick Wendell <pwendell@gmail.com>     2014-03-13 12:11:33 -0700
commit    698373211ef3cdf841c82d48168cd5dbe00a57b4 (patch)
tree      a07edbe4835a7b01aa48cf9bd35c0d6939d21d78 /docs/python-programming-guide.md
parent    e4e8d8f395aea48f0cae00d7c381a863c48a2837 (diff)
SPARK-1183. Don't use "worker" to mean executor
Author: Sandy Ryza <sandy@cloudera.com>
Closes #120 from sryza/sandy-spark-1183 and squashes the following commits:
5066a4a [Sandy Ryza] Remove "worker" in a couple comments
0bd1e46 [Sandy Ryza] Remove --am-class from usage
bfc8fe0 [Sandy Ryza] Remove am-class from doc and fix yarn-alpha
607539f [Sandy Ryza] Address review comments
74d087a [Sandy Ryza] SPARK-1183. Don't use "worker" to mean executor
Diffstat (limited to 'docs/python-programming-guide.md')
-rw-r--r--  docs/python-programming-guide.md  6
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/docs/python-programming-guide.md b/docs/python-programming-guide.md
index 57ed54c9cf..cbe7d820b4 100644
--- a/docs/python-programming-guide.md
+++ b/docs/python-programming-guide.md
@@ -43,9 +43,9 @@ def is_error(line):
 errors = logData.filter(is_error)
 {% endhighlight %}
-PySpark will automatically ship these functions to workers, along with any objects that they reference.
-Instances of classes will be serialized and shipped to workers by PySpark, but classes themselves cannot be automatically distributed to workers.
-The [Standalone Use](#standalone-use) section describes how to ship code dependencies to workers.
+PySpark will automatically ship these functions to executors, along with any objects that they reference.
+Instances of classes will be serialized and shipped to executors by PySpark, but classes themselves cannot be automatically distributed to executors.
+The [Standalone Use](#standalone-use) section describes how to ship code dependencies to executors.
 In addition, PySpark fully supports interactive use---simply run `./bin/pyspark` to launch an interactive shell.
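The paragraph changed by this diff makes a technical claim worth illustrating: PySpark serializes *instances* of classes when shipping them to executors, but cannot ship class definitions themselves. A minimal sketch of that distinction using plain `pickle` (PySpark's function serialization is in fact built on a pickle variant; the `Matcher` class below is a hypothetical example, not from the guide):

```python
import pickle

class Matcher:
    """A callable whose instances carry state (the word to search for)."""
    def __init__(self, word):
        self.word = word

    def __call__(self, line):
        return self.word in line

# Pickling an instance serializes its state (self.word)...
m = Matcher("ERROR")
payload = pickle.dumps(m)

# ...but the class itself is stored only as a module/name reference,
# so unpickling on an executor fails unless Matcher is importable there.
# (Here the class is in scope, so the round trip succeeds.)
restored = pickle.loads(payload)
print(restored("ERROR: disk full"))  # True
```

This is why the guide points readers to the Standalone Use section: class definitions must reach the executors as shipped code dependencies (e.g. via `.py` files), while instance state travels inside the serialized closure.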