author     Shixiong Zhu <shixiong@databricks.com>  2016-12-08 11:54:04 -0800
committer  Shixiong Zhu <shixiong@databricks.com>  2016-12-08 11:54:04 -0800
commit     26432df9cc6ffe569583aa628c6ecd7050b38316 (patch)
tree       a5dd7d05716e88c44b791d267c40e058460089f0 /core/src/main/scala/org/apache/spark/util/Utils.scala
parent     c3d3a9d0e85b834abef87069e4edd27db87fc607 (diff)
[SPARK-18751][CORE] Fix deadlock when SparkContext.stop is called in Utils.tryOrStopSparkContext
## What changes were proposed in this pull request?

When `SparkContext.stop` is called in `Utils.tryOrStopSparkContext` (the following three places), it causes a deadlock because the `stop` method needs to wait for the thread that is running `stop` to exit:

- ContextCleaner.keepCleaning
- LiveListenerBus.listenerThread.run
- TaskSchedulerImpl.start

This PR adds `SparkContext.stopInNewThread` and uses it to eliminate the potential deadlock. I also removed my changes in #15775 since they are not necessary now.

## How was this patch tested?

Jenkins

Author: Shixiong Zhu <shixiong@databricks.com>

Closes #16178 from zsxwing/fix-stop-deadlock.
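To make the failure mode concrete, here is a minimal, self-contained Scala sketch (not the actual Spark source; `DeadlockSketch`, `listenerThread`, and `stopped` are illustrative names) of why calling `stop()` from the very thread that `stop()` joins deadlocks, and how handing the call to a fresh thread avoids it:

```scala
// Minimal sketch of the deadlock and the stop-in-new-thread workaround.
// Not the real SparkContext code; names are illustrative.
object DeadlockSketch {
  @volatile private var stopped = false

  private val listenerThread: Thread = new Thread("listener-bus") {
    override def run(): Unit = {
      while (!stopped) {
        // ... process events ...
        // Before this commit, an uncaught fatal error here led to stop()
        // being called from this very thread, which then blocked forever
        // in listenerThread.join() below.
        Thread.sleep(100)
      }
    }
  }

  def start(): Unit = listenerThread.start()

  def stop(): Unit = {
    stopped = true
    // Deadlocks if the caller *is* listenerThread: a thread cannot wait
    // for itself to finish.
    listenerThread.join()
  }

  // The pattern this commit introduces: run stop() on a separate daemon
  // thread so the calling thread is never asked to join itself.
  def stopInNewThread(): Unit = {
    val t = new Thread("stop-in-new-thread") {
      override def run(): Unit = stop()
    }
    t.setDaemon(true)
    t.start()
  }
}
```

In this sketch, calling `DeadlockSketch.stop()` from `listenerThread` itself would hang, while `DeadlockSketch.stopInNewThread()` returns immediately and lets the helper thread perform the join.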
Diffstat (limited to 'core/src/main/scala/org/apache/spark/util/Utils.scala')
-rw-r--r--  core/src/main/scala/org/apache/spark/util/Utils.scala  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/util/Utils.scala b/core/src/main/scala/org/apache/spark/util/Utils.scala
index 91f5606127..c6ad154167 100644
--- a/core/src/main/scala/org/apache/spark/util/Utils.scala
+++ b/core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -1249,7 +1249,7 @@ private[spark] object Utils extends Logging {
val currentThreadName = Thread.currentThread().getName
if (sc != null) {
logError(s"uncaught error in thread $currentThreadName, stopping SparkContext", t)
- sc.stop()
+ sc.stopInNewThread()
}
if (!NonFatal(t)) {
logError(s"throw uncaught fatal error in thread $currentThreadName", t)
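For context, the following hedged sketch shows how a long-running daemon loop might wrap its body in `Utils.tryOrStopSparkContext` (the helper whose body the hunk above modifies). `EventLoopExample` and its loop body are hypothetical; the signature `tryOrStopSparkContext(sc)(block)` is assumed from the surrounding code, and real callers live inside the `org.apache.spark` packages because `Utils` is `private[spark]`.

```scala
package org.apache.spark.util

import org.apache.spark.SparkContext

// Hypothetical caller: a daemon thread whose body is guarded by
// tryOrStopSparkContext, so an uncaught fatal error stops the context
// via stopInNewThread() rather than deadlocking on sc.stop().
private[spark] class EventLoopExample(sc: SparkContext) {
  private val thread = new Thread("event-loop") {
    setDaemon(true)
    override def run(): Unit = Utils.tryOrStopSparkContext(sc) {
      while (!Thread.currentThread().isInterrupted) {
        // ... poll and dispatch events (illustrative only) ...
      }
    }
  }

  def start(): Unit = thread.start()
}
```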