author    Bharath Bhushan <manku.timma@outlook.com>  2014-04-12 20:52:29 -0700
committer Patrick Wendell <pwendell@gmail.com>  2014-04-12 20:53:44 -0700
commit    ca11919e6e97a62eb3e3ce882ffa29eae36f50f7 (patch)
tree      ae0b14fbfe14ff6d676086014516c4911c7d6842 /core
parent    c2d160fbee2ef90a7683d9771f2f632b68d74aef (diff)
[SPARK-1403] Move the class loader creation back to where it was in 0.9.0
[SPARK-1403] I investigated why Spark 0.9.0 loads fine on Mesos while Spark 1.0.0 fails. What I found was that in SparkEnv.scala, while creating the SparkEnv object, the current thread's class loader is null. But in 0.9.0, at the same place, it is set to org.apache.spark.repl.ExecutorClassLoader. I saw that https://github.com/apache/spark/commit/7edbea41b43e0dc11a2de156be220db8b7952d01 moved it to its current place. I moved it back and saw that 1.0.0 started working fine on Mesos.

I just created a minimal patch that allows me to run Spark on Mesos correctly. It seems like SecurityManager's creation needs to be taken into account for a correct fix. Also, moving the creation of the serializer out of SparkEnv might be a part of the right solution. PTAL.

Author: Bharath Bhushan <manku.timma@outlook.com>

Closes #322 from manku-timma/spark-1403 and squashes the following commits:

606c2b9 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
ec8f870 [Bharath Bhushan] revert the logger change for java 6 compatibility as PR 334 is doing it
728beca [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
044027d [Bharath Bhushan] fix compile error
6f260a4 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
b3a053f [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
04b9662 [Bharath Bhushan] add missing line
4803c19 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
f3c9a14 [Bharath Bhushan] Merge remote-tracking branch 'upstream/master' into spark-1403
42d3d6a [Bharath Bhushan] used code fragment from @ueshin to fix the problem in a better way
89109d7 [Bharath Bhushan] move the class loader creation back to where it was in 0.9.0
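The patch below follows a common pattern: save the thread's current context class loader, install the executor's own class loader for the duration of registration, and restore the original loader in a finally block. A minimal, self-contained sketch of that pattern in Scala (the ClassLoaderUtil helper and its method name are hypothetical, for illustration only; they are not part of this commit or of Spark):

// Minimal sketch of the save/set/restore context-class-loader pattern applied
// by the patch. ClassLoaderUtil and withContextClassLoader are hypothetical
// names used only for this illustration.
object ClassLoaderUtil {
  def withContextClassLoader[T](loader: ClassLoader)(body: => T): T = {
    val previous = Thread.currentThread.getContextClassLoader
    try {
      // Ensure code in `body` resolves classes via `loader` (e.g. the
      // executor's own class loader) instead of a null thread loader.
      Thread.currentThread.setContextClassLoader(loader)
      body
    } finally {
      // Always restore the caller's loader, even if `body` throws.
      Thread.currentThread.setContextClassLoader(previous)
    }
  }
}

// Example use, mirroring what registered() does in the diff below:
//   ClassLoaderUtil.withContextClassLoader(getClass.getClassLoader) {
//     // deserialize properties, create the Executor, ...
//   }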
Diffstat (limited to 'core')
-rw-r--r--  core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala | 22
1 file changed, 15 insertions(+), 7 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala b/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
index 6fc702fdb1..df36a06485 100644
--- a/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
+++ b/core/src/main/scala/org/apache/spark/executor/MesosExecutorBackend.scala
@@ -50,13 +50,21 @@ private[spark] class MesosExecutorBackend
       executorInfo: ExecutorInfo,
       frameworkInfo: FrameworkInfo,
       slaveInfo: SlaveInfo) {
-    logInfo("Registered with Mesos as executor ID " + executorInfo.getExecutorId.getValue)
-    this.driver = driver
-    val properties = Utils.deserialize[Array[(String, String)]](executorInfo.getData.toByteArray)
-    executor = new Executor(
-      executorInfo.getExecutorId.getValue,
-      slaveInfo.getHostname,
-      properties)
+    val cl = Thread.currentThread.getContextClassLoader
+    try {
+      // Work around for SPARK-1480
+      Thread.currentThread.setContextClassLoader(getClass.getClassLoader)
+      logInfo("Registered with Mesos as executor ID " + executorInfo.getExecutorId.getValue)
+      this.driver = driver
+      val properties = Utils.deserialize[Array[(String, String)]](executorInfo.getData.toByteArray)
+      executor = new Executor(
+        executorInfo.getExecutorId.getValue,
+        slaveInfo.getHostname,
+        properties)
+    } finally {
+      // Work around for SPARK-1480
+      Thread.currentThread.setContextClassLoader(cl)
+    }
   }
 
   override def launchTask(d: ExecutorDriver, taskInfo: TaskInfo) {