| author | Patrick Wendell <pwendell@gmail.com> | 2014-04-13 08:58:37 -0700 |
|---|---|---|
| committer | Patrick Wendell <pwendell@gmail.com> | 2014-04-13 08:58:37 -0700 |
| commit | 4bc07eebbf5e2ea0c0b6f1642049515025d88d07 (patch) | |
| tree | fc314a1c1d68055b04cdc37553669ea5f12c628b /repl/src | |
| parent | ca11919e6e97a62eb3e3ce882ffa29eae36f50f7 (diff) | |
SPARK-1480: Clean up use of classloaders
The Spark codebase has been somewhat fast-and-loose in how it accesses classloaders, and this has caused a few bugs to surface in master.
This patch defines utility methods for accessing classloaders. These make the intent explicit whenever a classloader is accessed, and fix a few cases where the wrong one was chosen. There are two cases:
case (a) -> We want the classloader that loaded Spark
case (b) -> We want the context class loader, or if not present, we want (a)
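The diff below uses two accessor methods, `Utils.getSparkClassLoader` and `Utils.getContextOrSparkClassLoader`, which correspond to cases (a) and (b). As a minimal sketch (the method names come from the patch; the bodies here are an illustration, not Spark's exact implementation):

```scala
// Illustrative sketch of the two classloader helpers described above.
// The names match the patch; the bodies are an approximation, not
// Spark's exact code.
object ClassLoaderUtils {
  /** Case (a): the classloader that loaded Spark itself. */
  def getSparkClassLoader: ClassLoader = getClass.getClassLoader

  /** Case (b): the thread's context classloader if one is set,
    * otherwise fall back to case (a). */
  def getContextOrSparkClassLoader: ClassLoader =
    Option(Thread.currentThread().getContextClassLoader)
      .getOrElse(getSparkClassLoader)
}
```

Wrapping the nullable `getContextClassLoader` in `Option` is what makes the fallback explicit, instead of callers each deciding (or forgetting to decide) how to handle a missing context loader.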
This patch provides a better fix for SPARK-1403 (https://issues.apache.org/jira/browse/SPARK-1403) than the current workaround, which it reverts. It also fixes a previously unreported bug: the `./spark-submit` script did not work when running with a `local` master. It failed because the executor classloader did not properly delegate to the context classloader (if one is defined), and in local mode the context classloader is set by the `./spark-submit` script. A unit test is added for that case.
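The local-mode bug comes down to parent delegation: the executor's classloader must use the context classloader as its parent when one is defined. A hypothetical sketch of that wiring (this is just the delegation pattern, not Spark's actual executor classloader code):

```scala
import java.net.{URL, URLClassLoader}

// Hypothetical illustration of the delegation fix: pick the context
// classloader as the parent when it is set (as ./spark-submit does in
// local mode), otherwise fall back to the loader that loaded this class.
def makeExecutorLoader(urls: Array[URL]): ClassLoader = {
  val parent = Option(Thread.currentThread().getContextClassLoader)
    .getOrElse(getClass.getClassLoader)
  // URLClassLoader delegates to `parent` first, so classes visible to the
  // context loader remain visible to code loaded by the executor loader.
  new URLClassLoader(urls, parent)
}
```

If the parent is instead hard-coded to the loader of the executor class, anything spark-submit placed on the context loader silently disappears, which is exactly the failure mode described above.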
Author: Patrick Wendell <pwendell@gmail.com>
Closes #398 from pwendell/class-loaders and squashes the following commits:
b4a1a58 [Patrick Wendell] Minor clean up
14f1272 [Patrick Wendell] SPARK-1480: Clean up use of classloaders
Diffstat (limited to 'repl/src')
-rw-r--r-- | repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala | 7 |
1 file changed, 4 insertions(+), 3 deletions(-)
```diff
diff --git a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
index 5a367b6bb7..beb40e8702 100644
--- a/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
+++ b/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
@@ -39,6 +39,7 @@ import scala.reflect.api.{Mirror, TypeCreator, Universe => ApiUniverse}
 import org.apache.spark.Logging
 import org.apache.spark.SparkConf
 import org.apache.spark.SparkContext
+import org.apache.spark.util.Utils
 
 /** The Scala interactive shell. It provides a read-eval-print loop
  *  around the Interpreter class.
@@ -130,7 +131,7 @@ class SparkILoop(in0: Option[BufferedReader], protected val out: JPrintWriter,
   def history = in.history
 
   /** The context class loader at the time this object was created */
-  protected val originalClassLoader = Thread.currentThread.getContextClassLoader
+  protected val originalClassLoader = Utils.getContextOrSparkClassLoader
 
   // classpath entries added via :cp
   var addedClasspath: String = ""
@@ -177,7 +178,7 @@ class SparkILoop(in0: Option[BufferedReader], protected val out: JPrintWriter,
     override lazy val formatting = new Formatting {
       def prompt = SparkILoop.this.prompt
     }
-    override protected def parentClassLoader = SparkHelper.explicitParentLoader(settings).getOrElse(classOf[SparkILoop].getClassLoader)
+    override protected def parentClassLoader = SparkHelper.explicitParentLoader(settings).getOrElse(classOf[SparkILoop].getClassLoader)
   }
 
   /** Create a new interpreter. */
@@ -871,7 +872,7 @@ class SparkILoop(in0: Option[BufferedReader], protected val out: JPrintWriter,
   }
 
   val u: scala.reflect.runtime.universe.type = scala.reflect.runtime.universe
-  val m = u.runtimeMirror(getClass.getClassLoader)
+  val m = u.runtimeMirror(Utils.getSparkClassLoader)
   private def tagOfStaticClass[T: ClassTag]: u.TypeTag[T] =
     u.TypeTag[T](
       m,
```