author     Marcelo Vanzin <vanzin@cloudera.com>    2015-04-27 19:46:17 -0400
committer  Sean Owen <sowen@cloudera.com>          2015-04-27 19:46:17 -0400
commit     5d45e1f60059e2f2fc8ad64778b9ddcc8887c570 (patch)
tree       149be28032b602497d6d8d7adf03f965448db669 /yarn
parent     8e1c00dbf4b60962908626dead744e5d73c8085e (diff)
[SPARK-3090] [CORE] Stop SparkContext if user forgets to.
Set up a shutdown hook to try to stop the SparkContext in case the user forgets to do it. The main effect is that any open log files are flushed and closed, which is particularly interesting for event logs.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #5696 from vanzin/SPARK-3090 and squashes the following commits:

3b554b5 [Marcelo Vanzin] [SPARK-3090] [core] Stop SparkContext if user forgets to.
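As a rough illustration of the idea in this commit (not Spark's actual implementation, which registers the hook internally with a priority-aware helper), a plain JVM shutdown hook can stop a forgotten SparkContext on exit; the app name and master below are placeholders.

import org.apache.spark.{SparkConf, SparkContext}

object ShutdownHookSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("hook-demo").setMaster("local[*]"))

    // Best-effort cleanup: if the application exits without calling sc.stop(),
    // this hook stops the context so open files (such as event logs) are
    // flushed and closed. sc.stop() is safe to call on an already-stopped context.
    sys.addShutdownHook {
      sc.stop()
    }

    // ... application logic; even if sc.stop() is never called here,
    // the hook above closes the event log on JVM exit ...
  }
}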
Diffstat (limited to 'yarn')
-rw-r--r--  yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala  |  10
1 file changed, 2 insertions(+), 8 deletions(-)
diff --git a/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala b/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
index 93ae45133c..70cb57ffd8 100644
--- a/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
+++ b/yarn/src/main/scala/org/apache/spark/deploy/yarn/ApplicationMaster.scala
@@ -95,14 +95,8 @@ private[spark] class ApplicationMaster(
val fs = FileSystem.get(yarnConf)
- Utils.addShutdownHook { () =>
- // If the SparkContext is still registered, shut it down as a best case effort in case
- // users do not call sc.stop or do System.exit().
- val sc = sparkContextRef.get()
- if (sc != null) {
- logInfo("Invoking sc stop from shutdown hook")
- sc.stop()
- }
+ // This shutdown hook should run *after* the SparkContext is shut down.
+ Utils.addShutdownHook(Utils.SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1) { () =>
val maxAppAttempts = client.getMaxRegAttempts(sparkConf, yarnConf)
val isLastAttempt = client.getAttemptId().getAttemptId() >= maxAppAttempts
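The priority passed to Utils.addShutdownHook above is what produces the ordering the new comment describes: higher-priority hooks run first, so registering the ApplicationMaster's hook at SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1 makes it run after the SparkContext's own stop hook. A minimal, self-contained sketch of that ordering idea (the object names and the priority constant below are illustrative, not Spark's internals):

import scala.collection.mutable

object PriorityHooks {
  private val hooks = mutable.ArrayBuffer.empty[(Int, () => Unit)]

  def add(priority: Int)(body: => Unit): Unit = synchronized {
    hooks += ((priority, () => body))
  }

  // A single JVM shutdown hook runs the registered hooks, highest priority first.
  sys.addShutdownHook {
    hooks.sortBy(-_._1).foreach { case (_, run) => run() }
  }
}

object Demo extends App {
  val SPARK_CONTEXT_SHUTDOWN_PRIORITY = 50 // illustrative value

  PriorityHooks.add(SPARK_CONTEXT_SHUTDOWN_PRIORITY) { println("stop SparkContext") }
  PriorityHooks.add(SPARK_CONTEXT_SHUTDOWN_PRIORITY - 1) { println("ApplicationMaster cleanup (runs second)") }
}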