author    Aaron Davidson <aaron@databricks.com>    2014-05-15 21:37:58 -0700
committer Patrick Wendell <pwendell@gmail.com>    2014-05-15 21:37:58 -0700
commit    bb98ecafce196ecc5bc3a1e4cc9264df7b752c6a (patch)
tree      afd827c08e53d63fa3fd0d594c673b4226e05be1 /core
parent    94c5139607ec876782e594012a108ebf55fa97db (diff)
SPARK-1860: Do not cleanup application work/ directories by default
This caused an unrecoverable failure for applications running longer than
7 days that had jars added to the SparkContext: the jars were cleaned up
even though the application was still running.
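The effect of the one-line change is easiest to see in how the worker resolves the flag. A minimal sketch of the lookup, assuming a hypothetical `getBoolean` stand-in over a plain `Map` rather than Spark's actual `SparkConf`:

```scala
// Sketch only: a hypothetical stand-in for SparkConf.getBoolean.
// With the new default of false, an unset spark.worker.cleanup.enabled
// means work/ directories are left untouched.
object CleanupConfigSketch {
  def getBoolean(settings: Map[String, String], key: String, default: Boolean): Boolean =
    settings.get(key).map(_.toBoolean).getOrElse(default)

  def main(args: Array[String]): Unit = {
    val unset = Map.empty[String, String]
    val optIn = Map("spark.worker.cleanup.enabled" -> "true")

    // After this commit: cleanup stays off unless explicitly enabled.
    assert(!getBoolean(unset, "spark.worker.cleanup.enabled", default = false))
    // Operators can still opt back in by setting the property to true.
    assert(getBoolean(optIn, "spark.worker.cleanup.enabled", default = false))
  }
}
```

Before the change the final argument was `true`, so long-running applications hit the cleaner unless they had explicitly opted out.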
Author: Aaron Davidson <aaron@databricks.com>
Closes #800 from aarondav/shitty-defaults and squashes the following commits:
a573fbb [Aaron Davidson] SPARK-1860: Do not cleanup application work/ directories by default
Diffstat (limited to 'core')

-rwxr-xr-x  core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala b/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
index 134624c35a..fb9cc116cd 100755
--- a/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
@@ -65,7 +65,7 @@ private[spark] class Worker(
   val REGISTRATION_TIMEOUT = 20.seconds
   val REGISTRATION_RETRIES = 3
 
-  val CLEANUP_ENABLED = conf.getBoolean("spark.worker.cleanup.enabled", true)
+  val CLEANUP_ENABLED = conf.getBoolean("spark.worker.cleanup.enabled", false)
   // How often worker will clean up old app folders
   val CLEANUP_INTERVAL_MILLIS = conf.getLong("spark.worker.cleanup.interval", 60 * 30) * 1000
   // TTL for app folders/data; after TTL expires it will be cleaned up
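Operators who relied on the old behavior can opt back in explicitly. A sketch of how that might look in `conf/spark-env.sh` on each worker, using `SPARK_WORKER_OPTS` (the standard hook for worker-side properties); the 1800-second interval shown is the 30-minute default visible in the hunk above:

```shell
# conf/spark-env.sh (per worker): re-enable periodic cleanup of finished
# applications' work/ directories, restoring the pre-SPARK-1860 behavior.
export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true -Dspark.worker.cleanup.interval=1800"
```

Note that with cleanup enabled, directories of applications that exceed the data TTL are removed even if the application is still running, which is exactly the failure mode this commit's new default avoids.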