author     Patrick Wendell <pwendell@gmail.com>    2014-04-13 08:58:37 -0700
committer  Patrick Wendell <pwendell@gmail.com>    2014-04-13 08:58:37 -0700
commit     4bc07eebbf5e2ea0c0b6f1642049515025d88d07
tree       fc314a1c1d68055b04cdc37553669ea5f12c628b /sql/catalyst
parent     ca11919e6e97a62eb3e3ce882ffa29eae36f50f7
SPARK-1480: Clean up use of classloaders
The Spark codebase is a bit fast-and-loose when accessing classloaders, and this has caused a few bugs to surface in master.
This patch defines some utility methods for accessing classloaders. This makes the intention when accessing a classloader much more explicit in the code and fixes a few cases where the wrong one was chosen. There are two cases:
case (a) -> We want the classloader that loaded Spark
case (b) -> We want the context class loader, or if not present, we want (a)
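The two cases above can be sketched as a pair of utility methods. The method names `getSparkClassLoader` and `getContextOrSparkClassLoader` match what the patch introduces in `org.apache.spark.util.Utils`; the minimal bodies below are an illustrative sketch, not Spark's exact implementation:

```scala
// Sketch of the two classloader-access cases described above.
object ClassLoaderUtils {
  /** Case (a): the classloader that loaded Spark itself. */
  def getSparkClassLoader: ClassLoader = getClass.getClassLoader

  /** Case (b): the thread's context classloader, falling back to (a)
    * when no context classloader is set. */
  def getContextOrSparkClassLoader: ClassLoader =
    Option(Thread.currentThread.getContextClassLoader)
      .getOrElse(getSparkClassLoader)
}
```

Centralizing the choice this way makes call sites state *which* loader they mean, instead of ad hoc `getClass.getClassLoader` or context-loader lookups scattered through the code.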
This patch provides a better fix for SPARK-1403 (https://issues.apache.org/jira/browse/SPARK-1403) than the current workaround, which it reverts. It also fixes a previously unreported bug: the `./spark-submit` script did not work for running with a `local` master, because the executor classloader did not properly delegate to the context classloader (when one is defined), and in local mode the context classloader is set by the `./spark-submit` script. A unit test is added for that case.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #398 from pwendell/class-loaders and squashes the following commits:
b4a1a58 [Patrick Wendell] Minor clean up
14f1272 [Patrick Wendell] SPARK-1480: Clean up use of classloaders
Diffstat (limited to 'sql/catalyst')
-rw-r--r--  sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/package.scala | 4 +++-
1 file changed, 3 insertions(+), 1 deletion(-)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/package.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/package.scala
index a001d95359..49fc4f70fd 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/package.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/package.scala
@@ -19,6 +19,8 @@ package org.apache.spark.sql.catalyst
 
 import java.io.{PrintWriter, ByteArrayOutputStream, FileInputStream, File}
 
+import org.apache.spark.util.{Utils => SparkUtils}
+
 package object util {
   /**
    * Returns a path to a temporary file that probably does not exist.
@@ -54,7 +56,7 @@ package object util {
   def resourceToString(
       resource:String,
       encoding: String = "UTF-8",
-      classLoader: ClassLoader = this.getClass.getClassLoader) = {
+      classLoader: ClassLoader = SparkUtils.getSparkClassLoader) = {
     val inStream = classLoader.getResourceAsStream(resource)
     val outStream = new ByteArrayOutputStream
     try {
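For context, the method being patched reads a classpath resource through an explicitly chosen classloader. The self-contained sketch below shows the shape of such a helper; the object name is hypothetical, and the default loader here is simply the loader of the enclosing class (standing in for `SparkUtils.getSparkClassLoader`):

```scala
import java.io.ByteArrayOutputStream

// Sketch of a resourceToString-style helper: resolve a classpath resource
// through an explicitly chosen classloader. Illustrative, not Spark's code.
object ResourceDemo {
  def resourceToString(
      resource: String,
      encoding: String = "UTF-8",
      classLoader: ClassLoader = getClass.getClassLoader): String = {
    val inStream = classLoader.getResourceAsStream(resource)
    require(inStream != null, s"resource not found: $resource")
    val outStream = new ByteArrayOutputStream
    try {
      val buf = new Array[Byte](4096)
      var n = inStream.read(buf)
      while (n != -1) {
        outStream.write(buf, 0, n)
        n = inStream.read(buf)
      }
    } finally {
      inStream.close()
    }
    new String(outStream.toByteArray, encoding)
  }
}
```

The point of the one-line change in the diff is that the *default* loader now comes from a single well-defined utility rather than whatever `this.getClass.getClassLoader` happens to be at the definition site.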