author    Matei Zaharia <matei@eecs.berkeley.edu>  2013-09-10 11:12:59 -0700
committer Matei Zaharia <matei@eecs.berkeley.edu>  2013-09-10 11:12:59 -0700
commit    2425eb85ca709273c48958f81a81c8a04657ea1f (patch)
tree      3dc3483b9be8fbed40bca10a8541aa5f58f598ec
parent    8c14f4b72269093a62510dbb2f1e954c103ffcd4 (diff)
Update Python API features
-rw-r--r--  docs/python-programming-guide.md  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/python-programming-guide.md b/docs/python-programming-guide.md
index 5662e7d02a..f67a1cc49c 100644
--- a/docs/python-programming-guide.md
+++ b/docs/python-programming-guide.md
@@ -16,7 +16,7 @@ This guide will show how to use the Spark features described there in Python.
There are a few key differences between the Python and Scala APIs:
* Python is dynamically typed, so RDDs can hold objects of multiple types.
-* PySpark does not yet support a few API calls, such as `lookup`, `sort`, and `persist` at custom storage levels. See the [API docs](api/pyspark/index.html) for details.
+* PySpark does not yet support a few API calls, such as `lookup`, `sort`, and non-text input files, though these will be added in future releases.
In PySpark, RDDs support the same methods as their Scala counterparts but take Python functions and return Python collection types.
Short functions can be passed to RDD methods using Python's [`lambda`](http://www.diveintopython.net/power_of_introspection/lambda_functions.html) syntax:
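For illustration, here is a minimal sketch of the lambda style described in the guide, assuming a running `SparkContext` named `sc` (as provided by the `pyspark` shell); the sample data and variable names are hypothetical:

```python
# Minimal sketch (assumed names): passing a short Python function to an RDD method.
# Requires an existing SparkContext `sc`, e.g. the one created by the pyspark shell.
words = sc.parallelize(["spark", "is", "fast"])   # RDD of Python strings
lengths = words.map(lambda w: len(w))             # short function passed via lambda
print(lengths.collect())                          # collect() returns a plain Python list: [5, 2, 4]
```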