author: Parag Chaudhari <paragpc@amazon.com> 2017-01-24 08:41:46 -0600
committer: Imran Rashid <irashid@cloudera.com> 2017-01-24 08:41:46 -0600
commit: 0ff67a1cf91ce4a36657c789c0fe676f4f89282f
tree: 52b3c6e83733a8b38cddc8ced32ba535e399da50 /sql/hive
parent: 752502be053c66a95b04204b4ae0e9574394bc58
[SPARK-14049][CORE] Add functionality in Spark history server API to query applications by end time
## What changes were proposed in this pull request?
Currently, the Spark history server REST API supports querying applications by start time range via the minDate and maxDate query parameters, but it lacks support for querying applications by their end time. This pull request proposes adding optional minEndDate and maxEndDate query parameters, and filtering based on them, to the Spark history server REST API. This functionality can be used for the following queries:
1. Applications finished in the last 'x' minutes
2. Applications finished before time 'y'
3. Applications finished between times 'x' and 'y'
4. Applications started after time 'x' and finished before time 'y'
For backward compatibility, the existing minDate and maxDate query parameters are kept as they are and continue to filter based on the start time range.
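As a sketch of how a client might combine the existing and proposed parameters, the snippet below builds a query URL against the history server's `/api/v1/applications` endpoint. The `applications_url` helper is hypothetical (not part of this patch), and the simple `yyyy-MM-dd` date strings are an assumption about the accepted format:

```python
# Hypothetical helper: build a history-server query URL using the proposed
# minEndDate/maxEndDate parameters alongside the existing minDate/maxDate ones.
def applications_url(base, min_date=None, max_date=None,
                     min_end_date=None, max_end_date=None):
    # Collect only the parameters the caller actually supplied.
    names = ["minDate", "maxDate", "minEndDate", "maxEndDate"]
    values = [min_date, max_date, min_end_date, max_end_date]
    params = [n + "=" + v for n, v in zip(names, values) if v is not None]
    url = base + "/api/v1/applications"
    return url + ("?" + "&".join(params) if params else url[:0]) or url

# Query: applications that finished between 2017-01-01 and 2017-01-24.
url = applications_url("http://localhost:18080",
                       min_end_date="2017-01-01",
                       max_end_date="2017-01-24")
print(url)
```

A query for "started after 'x' and finished before 'y'" would pass both `min_date` and `max_end_date`, mixing the old start-time filter with the new end-time filter.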
## How was this patch tested?
Existing unit tests and 4 new unit tests.
Author: Parag Chaudhari <paragpc@amazon.com>
Closes #11867 from paragpc/master-SHS-query-by-endtime_2.
Diffstat (limited to 'sql/hive')
0 files changed, 0 insertions, 0 deletions