From 478b2b7edcf42fa3e16f625d4b8676f2bb31f8dc Mon Sep 17 00:00:00 2001
From: Matei Zaharia
Date: Wed, 9 Oct 2013 12:08:04 -0700
Subject: Fix PySpark docs and an overly long line of code after fdbae41e

---
 docs/python-programming-guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/python-programming-guide.md b/docs/python-programming-guide.md
index f67a1cc49c..6c2336ad0c 100644
--- a/docs/python-programming-guide.md
+++ b/docs/python-programming-guide.md
@@ -16,7 +16,7 @@ This guide will show how to use the Spark features described there in Python.
 There are a few key differences between the Python and Scala APIs:
 
 * Python is dynamically typed, so RDDs can hold objects of multiple types.
-* PySpark does not yet support a few API calls, such as `lookup`, `sort`, and non-text input files, though these will be added in future releases.
+* PySpark does not yet support a few API calls, such as `lookup` and non-text input files, though these will be added in future releases.
 
 In PySpark, RDDs support the same methods as their Scala counterparts but take Python functions and return Python collection types. Short functions can be passed to RDD methods using Python's [`lambda`](http://www.diveintopython.net/power_of_introspection/lambda_functions.html) syntax:
-- 
cgit v1.2.3
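
The guide text touched by this patch says that PySpark RDD methods take Python functions, commonly written with `lambda`. As a minimal sketch of that style without assuming a Spark installation, the same pattern can be shown with plain Python's built-in `map` and `filter`; the `rdd.map(...)`/`rdd.filter(...)` calls in the comments are the PySpark analogues, not code run here.

```python
# Illustrative only: mimics the lambda-passing style the PySpark guide
# describes, using plain Python lists instead of an RDD.
data = [1, 2, 3, 4, 5]

# Analogous to: rdd.map(lambda x: x * x)
squared = list(map(lambda x: x * x, data))

# Analogous to: rdd.filter(lambda x: x % 2 == 0)
evens = list(filter(lambda x: x % 2 == 0, data))

print(squared)  # [1, 4, 9, 16, 25]
print(evens)    # [2, 4]
```

In actual PySpark the equivalent calls would run against a `SparkContext`, e.g. `sc.parallelize(data).map(lambda x: x * x).collect()`.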