author    Josh Rosen <joshrosen@apache.org>  2014-07-26 17:37:05 -0700
committer Matei Zaharia <matei@databricks.com>  2014-07-26 17:37:05 -0700
commit    ba46bbed5d32aec0f11f0b71c82bba8dbe19f05a (patch)
tree      5826bc60fdb70aebf9b0a9e3887dbce96d526851 /core
parent    12901643b7e808aa75cf0b19e2d0c3d40b1a978d (diff)
[SPARK-2601] [PySpark] Fix Py4J error when transforming pickleFiles
Similar to SPARK-1034, the problem was that Py4J didn't cope well with the fake ClassTags used in the Java API. It doesn't look like there's any reason why PythonRDD needs to take a ClassTag, since it just ignores the type of the previous RDD, so I removed the type parameter and we no longer pass ClassTags from Python.

Author: Josh Rosen <joshrosen@apache.org>

Closes #1605 from JoshRosen/spark-2601 and squashes the following commits:

b68e118 [Josh Rosen] Fix Py4J error when transforming pickleFiles [SPARK-2601]
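The shape of the change is easiest to see in isolation. The sketch below is not the Spark source; `FakeRDD` is a hypothetical stand-in for `org.apache.spark.rdd.RDD`, used only to show how replacing an unused `ClassTag`-bound type parameter with an existential type (`_`) removes the implicit that Py4J could not supply from Python.

```scala
import scala.reflect.ClassTag

// Hypothetical stand-in for org.apache.spark.rdd.RDD, just so the sketch compiles.
class FakeRDD[T](val elems: Seq[T])

// Before: the constructor demands a ClassTag[T] even though the element type
// of the parent RDD is never actually used.
class PythonRDDBefore[T: ClassTag](parent: FakeRDD[T]) {
  def parentSize: Int = parent.elems.size
}

// After: an existential type drops the type parameter entirely, so callers
// (including ones driven through Py4J) no longer need to conjure a ClassTag.
class PythonRDDAfter(parent: FakeRDD[_]) {
  def parentSize: Int = parent.elems.size
}
```

From Scala both forms behave identically; the difference only matters at the Java/Py4J boundary, where the implicit `ClassTag` argument had to be faked.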
Diffstat (limited to 'core')
-rw-r--r--  core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala b/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
index d6b0988641..d87783efd2 100644
--- a/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
+++ b/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala
@@ -37,8 +37,8 @@ import org.apache.spark.broadcast.Broadcast
 import org.apache.spark.rdd.RDD
 import org.apache.spark.util.Utils
 
-private[spark] class PythonRDD[T: ClassTag](
-    parent: RDD[T],
+private[spark] class PythonRDD(
+    parent: RDD[_],
     command: Array[Byte],
     envVars: JMap[String, String],
     pythonIncludes: JList[String],