| author | Josh Rosen <joshrosen@databricks.com> | 2016-03-14 12:22:02 -0700 |
|---|---|---|
| committer | Josh Rosen <joshrosen@databricks.com> | 2016-03-14 12:22:02 -0700 |
| commit | 07cb323e7a128b87ef265ddc66f033365d9de463 (patch) | |
| tree | 41de1d7201b8592ffd8742aa09a98ee1beb207e6 /core | |
| parent | 6a4bfcd62b7effcfbb865bdd301d41a0ba6e5c94 (diff) | |
[SPARK-13848][SPARK-5185] Update to Py4J 0.9.2 in order to fix classloading issue
This patch upgrades Py4J from 0.9.1 to 0.9.2 in order to include a patch which modifies Py4J to use the current thread's ContextClassLoader when performing reflection / class loading. This is necessary in order to fix [SPARK-5185](https://issues.apache.org/jira/browse/SPARK-5185), a longstanding issue affecting the use of `--jars` and `--packages` in PySpark.
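To illustrate the distinction the fix relies on, here is a minimal, hypothetical sketch (not Py4J's actual code): `Class.forName(name)` resolves against the caller's defining classloader, which cannot see jars added to the context classloader at runtime (as `--jars` does), whereas passing the current thread's context classloader explicitly can.

```java
// Hypothetical demo class; names are illustrative, not from Py4J.
public class ContextLoaderDemo {
    // Resolve a class via the current thread's context classloader,
    // the behavior Py4J 0.9.2 adopts for its reflection.
    public static Class<?> loadWithContextClassLoader(String name)
            throws ClassNotFoundException {
        ClassLoader loader = Thread.currentThread().getContextClassLoader();
        // The three-argument Class.forName lets us pick the loader;
        // the one-argument form would use the caller's defining loader.
        return Class.forName(name, true, loader);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(loadWithContextClassLoader("java.lang.String").getName());
        // prints "java.lang.String"
    }
}
```

In PySpark's case, jars supplied via `--jars` are visible only through the context classloader of the gateway's threads, so reflection that ignores it fails to find those classes.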
In order to demonstrate that the fix works, I removed the workarounds which were added as part of [SPARK-6027](https://issues.apache.org/jira/browse/SPARK-6027) / #4779 and other patches.
Py4J diff: https://github.com/bartdag/py4j/compare/0.9.1...0.9.2
/cc zsxwing tdas davies brkyvz
Author: Josh Rosen <joshrosen@databricks.com>
Closes #11687 from JoshRosen/py4j-0.9.2.
Diffstat (limited to 'core')
| -rw-r--r-- | core/pom.xml | 2 |
| -rw-r--r-- | core/src/main/scala/org/apache/spark/api/python/PythonUtils.scala | 2 |
2 files changed, 2 insertions, 2 deletions
```diff
diff --git a/core/pom.xml b/core/pom.xml
index be40d9936a..4c7e3a3662 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -314,7 +314,7 @@
     <dependency>
       <groupId>net.sf.py4j</groupId>
       <artifactId>py4j</artifactId>
-      <version>0.9.1</version>
+      <version>0.9.2</version>
     </dependency>
     <dependency>
       <groupId>org.apache.spark</groupId>
diff --git a/core/src/main/scala/org/apache/spark/api/python/PythonUtils.scala b/core/src/main/scala/org/apache/spark/api/python/PythonUtils.scala
index bda872746c..8bcd2903fe 100644
--- a/core/src/main/scala/org/apache/spark/api/python/PythonUtils.scala
+++ b/core/src/main/scala/org/apache/spark/api/python/PythonUtils.scala
@@ -32,7 +32,7 @@ private[spark] object PythonUtils {
     val pythonPath = new ArrayBuffer[String]
     for (sparkHome <- sys.env.get("SPARK_HOME")) {
       pythonPath += Seq(sparkHome, "python", "lib", "pyspark.zip").mkString(File.separator)
-      pythonPath += Seq(sparkHome, "python", "lib", "py4j-0.9.1-src.zip").mkString(File.separator)
+      pythonPath += Seq(sparkHome, "python", "lib", "py4j-0.9.2-src.zip").mkString(File.separator)
     }
     pythonPath ++= SparkContext.jarOfObject(this)
     pythonPath.mkString(File.pathSeparator)
```