| author | Devaraj K <devaraj@apache.org> | 2016-05-30 14:29:27 -0700 |
|---|---|---|
| committer | Kay Ousterhout <kayousterhout@gmail.com> | 2016-05-30 14:29:27 -0700 |
| commit | 5b21139dbf3bd09cb3a590bd0ffb857ea92dc23c | |
| tree | daa81d932fd5cc1d3ababd6eb86affe27c941699 /core/src/main/resources | |
| parent | 2d34183b273af1125181f04c49725efc2fa351af | |
[SPARK-10530][CORE] Kill other task attempts when one attempt of the same task succeeds during speculation
## What changes were proposed in this pull request?
With this patch, TaskSetManager kills the other running attempts of a task as soon as any one attempt succeeds. Killed attempts are no longer counted as failed tasks: they are listed separately in the UI, with their task state shown as KILLED instead of FAILED.
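The behavior above can be sketched as a minimal, self-contained model. This is illustrative only, not Spark's actual `TaskSetManager` internals: the `Attempt` class and `onAttemptSuccess` method are hypothetical names chosen for this sketch.

```scala
// Simplified model of the patch's behavior: when one attempt of a task
// succeeds, every other still-running attempt of that task is marked
// KILLED (not FAILED). All names here are illustrative, not Spark's.
object SpeculativeKillSketch {
  sealed trait TaskState
  case object RUNNING extends TaskState
  case object SUCCESS extends TaskState
  case object KILLED  extends TaskState

  // One speculative attempt of a task; `state` is mutable for simplicity.
  final case class Attempt(taskId: Int, attemptId: Int,
                           var state: TaskState = RUNNING)

  // Mark the winning attempt SUCCESS and kill all other running
  // attempts that belong to the same task.
  def onAttemptSuccess(attempts: Seq[Attempt],
                       taskId: Int, attemptId: Int): Unit = {
    attempts.filter(_.taskId == taskId).foreach { a =>
      if (a.attemptId == attemptId) a.state = SUCCESS
      else if (a.state == RUNNING)  a.state = KILLED
    }
  }

  def main(args: Array[String]): Unit = {
    // Task 0 has two speculative attempts; task 1 has one.
    val attempts = Seq(Attempt(0, 0), Attempt(0, 1), Attempt(1, 0))
    onAttemptSuccess(attempts, taskId = 0, attemptId = 1)
    // Attempt (0,0) is killed, (0,1) succeeded, (1,0) keeps running.
    println(attempts.map(_.state).mkString(","))
  }
}
```

Note the guard on `a.state == RUNNING`: attempts that already finished are left untouched, so only genuinely in-flight duplicates get killed.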
## How was this patch tested?
core/src/test/scala/org/apache/spark/ui/jobs/JobProgressListenerSuite.scala
core/src/test/scala/org/apache/spark/util/JsonProtocolSuite.scala
I also verified this patch manually with spark.speculation set to true: when any attempt succeeds, the other running attempts of the same task are killed, and pending tasks are assigned to the freed slots. Killed attempts are then reported as KILLED rather than FAILED. Please see the attached screenshots for reference.
![stage-tasks-table](https://cloud.githubusercontent.com/assets/3174804/14075132/394c6a12-f4f4-11e5-8638-20ff7b8cc9bc.png)
![stages-table](https://cloud.githubusercontent.com/assets/3174804/14075134/3b60f412-f4f4-11e5-9ea6-dd0dcc86eb03.png)
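For reproducing the manual test, speculation can be enabled at submit time. The configuration keys below are real Spark settings (speculation is off by default); the application class and jar name are placeholders.

```shell
# Enable speculative execution so duplicate attempts are launched for
# slow tasks; the kill-on-success behavior in this patch only applies
# when speculation is on.
spark-submit \
  --conf spark.speculation=true \
  --conf spark.speculation.quantile=0.75 \
  --conf spark.speculation.multiplier=1.5 \
  --class org.example.MyApp \
  my-app.jar
```

`spark.speculation.quantile` controls what fraction of tasks must finish before speculation kicks in, and `spark.speculation.multiplier` sets how many times slower than the median a task must be to be considered for a speculative attempt.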
Ref : https://github.com/apache/spark/pull/11916
Author: Devaraj K <devaraj@apache.org>
Closes #11996 from devaraj-kavali/SPARK-10530.
Diffstat (limited to 'core/src/main/resources')
0 files changed, 0 insertions, 0 deletions