path: root/core
authorIulian Dragos <jaguarul@gmail.com>2015-10-30 16:51:32 +0000
committerSean Owen <sowen@cloudera.com>2015-10-30 16:51:32 +0000
commit0451b00148a294c665146563242d2fe2de943a02 (patch)
tree552673be7d79fca3c76b596d89748ce892066116 /core
parent14d08b99085d4e609aeae0cf54d4584e860eb552 (diff)
[SPARK-10986][MESOS] Set the context class loader in the Mesos executor backend.
See [SPARK-10986](https://issues.apache.org/jira/browse/SPARK-10986) for details. This fixes the `ClassNotFoundException` for Spark classes in the serializer.

I am not sure this is the right way to handle the class loader, but I couldn't find any documentation on how the context class loader is used and who relies on it. It seems at least the serializer uses it to instantiate classes during deserialization. I am open to suggestions (I tried this fix on a real Mesos cluster and it *does* fix the issue).

tnachen andrewor14

Author: Iulian Dragos <jaguarul@gmail.com>

Closes #9282 from dragos/issue/mesos-classloader.
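To illustrate the failure mode the patch addresses, here is a minimal standalone sketch (not Spark code; `ContextLoaderDemo` and `resolve` are hypothetical names). A deserializer that resolves class names through the thread's context class loader cannot find application classes when that loader is unset or too narrow; pointing it at the application's own class loader, as the patch does, restores lookup:

```scala
// Minimal sketch of why the context class loader matters for deserialization.
object ContextLoaderDemo {
  // Resolve a class the way many serializers do: via the thread's
  // context class loader, falling back to the system loader if unset.
  def resolve(name: String): Class[_] = {
    val loader = Option(Thread.currentThread().getContextClassLoader)
      .getOrElse(ClassLoader.getSystemClassLoader)
    Class.forName(name, false, loader)
  }

  def main(args: Array[String]): Unit = {
    // Mirror the fix: point the context class loader at the loader that
    // knows about this application's classes before deserializing.
    Thread.currentThread().setContextClassLoader(getClass.getClassLoader)
    println(resolve("scala.Option").getName)
  }
}
```

If the context class loader were left null (as in the unfixed Mesos executor backend), a serializer using it directly would throw `ClassNotFoundException` for any class not visible to the fallback loader.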
Diffstat (limited to 'core')
core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala | 5 +++++
1 file changed, 5 insertions(+), 0 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala b/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
index 0474fd2ccc..c9f18ebc7f 100644
--- a/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
+++ b/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
@@ -63,6 +63,11 @@ private[spark] class MesosExecutorBackend
logInfo(s"Registered with Mesos as executor ID $executorId with $cpusPerTask cpus")
this.driver = driver
+ // Set a context class loader to be picked up by the serializer. Without this call
+ // the serializer would default to the null class loader, and fail to find Spark classes.
+ // See SPARK-10986.
+ Thread.currentThread().setContextClassLoader(this.getClass.getClassLoader)
+
val properties = Utils.deserialize[Array[(String, String)]](executorInfo.getData.toByteArray) ++
Seq[(String, String)](("spark.app.id", frameworkInfo.getId.getValue))
val conf = new SparkConf(loadDefaults = true).setAll(properties)