path: root/python/pyspark/ml/recommendation.py
author: Tathagata Das <tathagata.das1565@gmail.com>  2015-05-28 22:28:13 -0700
committer: Patrick Wendell <patrick@databricks.com>  2015-05-28 22:28:13 -0700
commit: cd3d9a5c0c3e77098a72c85dffe4a27737009ae7 (patch)
tree: 3c39655f6c133b66341a7524e9a70ac1ac3e9f2e /python/pyspark/ml/recommendation.py
parent: 04ddcd4db7801abefa9c9effe5d88413b29d713b (diff)
[SPARK-7930] [CORE] [STREAMING] Fixed shutdown hook priorities
The shutdown hook for temp directories had priority 100 while SparkContext's had priority 50, so the local root directory was deleted before the SparkContext was shut down. This led to scary errors on running jobs at shutdown time. It was especially a problem when running streaming examples, where Ctrl-C is the only way to shut down.

The fix in this PR is to make the temp directory shutdown priority lower than SparkContext's, so that the temp dirs are the last thing to get deleted, after the SparkContext has been shut down. Also, the DiskBlockManager shutdown priority is changed from the default 100 to temp_dir_prio + 1, so that it is invoked just before all temp dirs are cleared.

Author: Tathagata Das <tathagata.das1565@gmail.com>

Closes #6482 from tdas/SPARK-7930 and squashes the following commits:

d7cbeb5 [Tathagata Das] Removed unnecessary line
1514d0b [Tathagata Das] Fixed shutdown hook priorities
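The priority ordering described in the commit message can be sketched as a minimal illustrative model. This is not Spark's actual ShutdownHookManager API; the class, method names, and the specific constant values below are assumptions chosen to mirror the ordering the fix establishes (hooks with a higher priority run first, so giving temp-dir cleanup a priority below SparkContext's makes it run last):

```python
# Illustrative sketch of priority-ordered shutdown hooks (hypothetical,
# not Spark's real ShutdownHookManager). Higher priority runs FIRST.

# Hypothetical constants mirroring the ordering in the fix: SparkContext
# stops first, then DiskBlockManager, then temp dirs are deleted last.
SPARK_CONTEXT_SHUTDOWN_PRIORITY = 50
TEMP_DIR_SHUTDOWN_PRIORITY = 25          # lower than SparkContext => runs later
DISK_BLOCK_MANAGER_PRIORITY = TEMP_DIR_SHUTDOWN_PRIORITY + 1

class ShutdownHookManager:
    def __init__(self):
        self._hooks = []  # list of (priority, callable)

    def add_shutdown_hook(self, priority, fn):
        self._hooks.append((priority, fn))

    def run_all(self):
        # Sort descending by priority: highest-priority hook executes first.
        for _, fn in sorted(self._hooks, key=lambda h: -h[0]):
            fn()

order = []
mgr = ShutdownHookManager()
mgr.add_shutdown_hook(TEMP_DIR_SHUTDOWN_PRIORITY,
                      lambda: order.append("delete temp dirs"))
mgr.add_shutdown_hook(SPARK_CONTEXT_SHUTDOWN_PRIORITY,
                      lambda: order.append("stop SparkContext"))
mgr.add_shutdown_hook(DISK_BLOCK_MANAGER_PRIORITY,
                      lambda: order.append("stop DiskBlockManager"))
mgr.run_all()
# order: ["stop SparkContext", "stop DiskBlockManager", "delete temp dirs"]
```

Before the fix, the temp-dir hook had the higher priority (100 vs. 50), so it ran before the SparkContext hook; swapping the ordering of the priorities is the whole change.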
Diffstat (limited to 'python/pyspark/ml/recommendation.py')
0 files changed, 0 insertions, 0 deletions