From 4cebb79c9f3067da0c533292de45d7ecf56f2ff2 Mon Sep 17 00:00:00 2001
From: Josh Rosen
Date: Thu, 23 Jan 2014 20:01:36 -0800
Subject: Deprecate mapPartitionsWithSplit in PySpark.

Also, replace the last reference to it in the docs.

This fixes SPARK-1026.
---
 docs/scala-programming-guide.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/scala-programming-guide.md b/docs/scala-programming-guide.md
index c1ef46a1cd..7c0f67bc99 100644
--- a/docs/scala-programming-guide.md
+++ b/docs/scala-programming-guide.md
@@ -168,9 +168,9 @@ The following tables list the transformations and actions currently supported (s
     Iterator[T] => Iterator[U] when running on an RDD of type T.
-  mapPartitionsWithSplit(func)
+  mapPartitionsWithIndex(func)
   Similar to mapPartitions, but also provides func with an integer value representing the index of
-  the split, so func must be of type (Int, Iterator[T]) => Iterator[U] when running on an RDD of type T.
+  the partition, so func must be of type (Int, Iterator[T]) => Iterator[U] when running on an RDD of type T.
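
For context, a minimal sketch of the renamed transformation described in the doc hunk above, not part of the patch itself. It assumes an existing SparkContext named sc and shows a func of type (Int, Iterator[T]) => Iterator[U] that pairs each element with the index of its partition; the exact partitioning of the output is illustrative only.

    // Hypothetical usage sketch of mapPartitionsWithIndex (assumes a SparkContext `sc`).
    val rdd = sc.parallelize(1 to 10, numSlices = 3)
    // func: (Int, Iterator[T]) => Iterator[U] -- receives the partition index plus that
    // partition's iterator, and returns a new iterator.
    val tagged = rdd.mapPartitionsWithIndex { (index, iter) =>
      iter.map(x => (index, x))
    }
    tagged.collect()  // e.g. Array((0,1), (0,2), (0,3), (1,4), ...)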