path: root/network
author     Mark Hamstra <markhamstra@gmail.com>  2015-11-25 09:34:34 -0600
committer  Imran Rashid <irashid@cloudera.com>   2015-11-25 09:34:34 -0600
commit     0a5aef753e70e93d7e56054f354a52e4d4e18932 (patch)
tree       a079c2538fe5f91b76eb2630e351e5a99b7d37f3 /network
parent     b9b6fbe89b6d1a890faa02c1a53bb670a6255362 (diff)
download   spark-0a5aef753e70e93d7e56054f354a52e4d4e18932.tar.gz
           spark-0a5aef753e70e93d7e56054f354a52e4d4e18932.tar.bz2
           spark-0a5aef753e70e93d7e56054f354a52e4d4e18932.zip
[SPARK-10666][SPARK-6880][CORE] Use properties from ActiveJob associated with a Stage
This issue was addressed in https://github.com/apache/spark/pull/5494, but the fix in that PR, while safe in the sense that it will prevent the SparkContext from shutting down, misses the actual bug. The intent of `submitMissingTasks` should be understood as "submit the Tasks that are missing for the Stage, and run them as part of the ActiveJob identified by jobId". Because of a long-standing bug, the `jobId` parameter was never being used. Instead, we were trying to use the jobId with which the Stage was created -- which may no longer exist as an ActiveJob, hence the crash reported in SPARK-6880.

The correct fix is to use the ActiveJob specified by the supplied `jobId` parameter, which is guaranteed to exist at the call sites of `submitMissingTasks`.

This fix should be applied to all maintenance branches, since the bug has existed since 1.0.

kayousterhout pankajarora12

Author: Mark Hamstra <markhamstra@gmail.com>
Author: Imran Rashid <irashid@cloudera.com>

Closes #6291 from markhamstra/SPARK-6880.
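As a rough illustration of the change described above, here is a minimal, self-contained Scala sketch. It is not the actual Spark source: `ActiveJob`, `Stage`, and `jobIdToActiveJob` are simplified stand-ins for the DAGScheduler internals named in this message, and the field and method names follow the description rather than the exact code.

```scala
import java.util.Properties
import scala.collection.mutable

// Hypothetical, simplified stand-ins for the scheduler's internal types.
case class ActiveJob(jobId: Int, properties: Properties)
case class Stage(id: Int, firstJobId: Int)

object DAGSchedulerSketch {
  // Maps a job id to its ActiveJob while the job is running; entries are
  // removed when a job completes.
  val jobIdToActiveJob = mutable.HashMap.empty[Int, ActiveJob]

  def submitMissingTasks(stage: Stage, jobId: Int): Unit = {
    // Old, buggy lookup: uses the job that *created* the stage, which may
    // have finished and been removed from the map (the SPARK-6880 crash):
    //   val properties =
    //     jobIdToActiveJob.get(stage.firstJobId).map(_.properties).orNull

    // Fixed lookup: the supplied jobId is guaranteed to refer to an
    // ActiveJob at every call site of submitMissingTasks.
    val properties = jobIdToActiveJob(jobId).properties

    // ... create and submit the stage's missing tasks, attaching
    // `properties` so they run as part of the ActiveJob for jobId ...
    println(s"Submitting tasks for stage ${stage.id} under job $jobId with $properties")
  }

  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.setProperty("spark.scheduler.pool", "prod")
    // The stage was created by job 1, which has since finished; job 2 now
    // re-uses it, so only job 2 is present in jobIdToActiveJob.
    jobIdToActiveJob(2) = ActiveJob(2, props)
    submitMissingTasks(Stage(id = 0, firstJobId = 1), jobId = 2)
  }
}
```

In the `main` scenario above, the old lookup by `stage.firstJobId` would find no ActiveJob for job 1, while the fixed lookup by the supplied `jobId` succeeds, which is the essence of the change.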
Diffstat (limited to 'network')
0 files changed, 0 insertions, 0 deletions