author    Imran Rashid <irashid@cloudera.com>  2015-05-05 07:25:40 -0500
committer Imran Rashid <irashid@cloudera.com>  2015-05-05 07:25:40 -0500
commit    d49735800db27239c11478aac4b0f2ec9df91a3f (patch)
tree      b70111993f4c8fb8913987b5b1d7dae080d26190 /core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767
parent    51f462003b416eac92feb5a6725f6c2994389010 (diff)
download  spark-d49735800db27239c11478aac4b0f2ec9df91a3f.tar.gz
          spark-d49735800db27239c11478aac4b0f2ec9df91a3f.tar.bz2
          spark-d49735800db27239c11478aac4b0f2ec9df91a3f.zip
[SPARK-3454] separate json endpoints for data in the UI
Exposes data available in the UI as json over http. Key points:

* new endpoints, handled independently of existing XyzPage classes. Root entrypoint is `JsonRootResource`
* Uses jersey + jackson for routing & converting POJOs into json
* tests against known results in `HistoryServerSuite`
* also fixes some minor issues w/ the UI -- synchronizing on access to `StorageListener` & `StorageStatusListener`, and fixing some inconsistencies w/ the way we handle retained jobs & stages.

Author: Imran Rashid <irashid@cloudera.com>

Closes #4435 from squito/SPARK-3454 and squashes the following commits:

da1e35f [Imran Rashid] typos etc.
5e78b4f [Imran Rashid] fix rendering problems
5ae02ad [Imran Rashid] Merge branch 'master' into SPARK-3454
f016182 [Imran Rashid] change all constructors json-pojo class constructors to be private[spark] to protect us from mima-false-positives if we add fields
3347b72 [Imran Rashid] mark EnumUtil as @Private
ec140a2 [Imran Rashid] create @Private
cc1febf [Imran Rashid] add docs on the metrics-as-json api
cbaf287 [Imran Rashid] Merge branch 'master' into SPARK-3454
56db31e [Imran Rashid] update tests for mulit-attempt
7f3bc4e [Imran Rashid] Revert "add sbt-revolved plugin, to make it easier to start & stop http servers in sbt"
67008b4 [Imran Rashid] rats
9e51400 [Imran Rashid] style
c9bae1c [Imran Rashid] handle multiple attempts per app
b87cd63 [Imran Rashid] add sbt-revolved plugin, to make it easier to start & stop http servers in sbt
188762c [Imran Rashid] multi-attempt
2af11e5 [Imran Rashid] Merge branch 'master' into SPARK-3454
befff0c [Imran Rashid] review feedback
14ac3ed [Imran Rashid] jersey-core needs to be explicit; move version & scope to parent pom.xml
f90680e [Imran Rashid] Merge branch 'master' into SPARK-3454
dc8a7fe [Imran Rashid] style, fix errant comments
acb7ef6 [Imran Rashid] fix indentation
7bf1811 [Imran Rashid] move MetricHelper so mima doesnt think its exposed; comments
9d889d6 [Imran Rashid] undo some unnecessary changes
f48a7b0 [Imran Rashid] docs
52bbae8 [Imran Rashid] StorageListener & StorageStatusListener needs to synchronize internally to be thread-safe
31c79ce [Imran Rashid] asm no longer needed for SPARK_PREPEND_CLASSES
b2f8b91 [Imran Rashid] @DeveloperApi
2e19be2 [Imran Rashid] lazily convert ApplicationInfo to avoid memory overhead
ba3d9d2 [Imran Rashid] upper case enums
39ac29c [Imran Rashid] move EnumUtil
d2bde77 [Imran Rashid] update error handling & scoping
4a234d3 [Imran Rashid] avoid jersey-media-json-jackson b/c of potential version conflicts
a157a2f [Imran Rashid] style
7bd4d15 [Imran Rashid] delete security test, since it doesnt do anything
a325563 [Imran Rashid] style
a9c5cf1 [Imran Rashid] undo changes superceeded by master
0c6f968 [Imran Rashid] update deps
1ed0d07 [Imran Rashid] Merge branch 'master' into SPARK-3454
4c92af6 [Imran Rashid] style
f2e63ad [Imran Rashid] Merge branch 'master' into SPARK-3454
c22b11f [Imran Rashid] fix compile error
9ea682c [Imran Rashid] go back to good ol' java enums
cf86175 [Imran Rashid] style
d493b38 [Imran Rashid] Merge branch 'master' into SPARK-3454
f05ae89 [Imran Rashid] add in ExecutorSummaryInfo for MiMa :(
101a698 [Imran Rashid] style
d2ef58d [Imran Rashid] revert changes that had HistoryServer refresh the application listing more often
b136e39b [Imran Rashid] Revert "add sbt-revolved plugin, to make it easier to start & stop http servers in sbt"
e031719 [Imran Rashid] fixes from review
1f53a66 [Imran Rashid] style
b4a7863 [Imran Rashid] fix compile error
2c8b7ee [Imran Rashid] rats
1578a4a [Imran Rashid] doc
674f8dc [Imran Rashid] more explicit about total numbers of jobs & stages vs. number retained
9922be0 [Imran Rashid] Merge branch 'master' into stage_distributions
f5a5196 [Imran Rashid] undo removal of renderJson from MasterPage, since there is no substitute yet
db61211 [Imran Rashid] get JobProgressListener directly from UI
fdfc181 [Imran Rashid] stage/taskList
63eb4a6 [Imran Rashid] tests for taskSummary
ad27de8 [Imran Rashid] error handling on quantile values
b2efcaf [Imran Rashid] cleanup, combine stage-related paths into one resource
aaba896 [Imran Rashid] wire up task summary
a4b1397 [Imran Rashid] stage metric distributions
e48ba32 [Imran Rashid] rename
eaf3bbb [Imran Rashid] style
25cd894 [Imran Rashid] if only given day, assume GMT
51eaedb [Imran Rashid] more visibility fixes
9f28b7e [Imran Rashid] ack, more cleanup
99764e1 [Imran Rashid] Merge branch 'SPARK-3454_w_jersey' into SPARK-3454
a61a43c [Imran Rashid] oops, remove accidental checkin
a066055 [Imran Rashid] set visibility on a lot of classes
1f361c8 [Imran Rashid] update rat-excludes
0be5120 [Imran Rashid] Merge branch 'master' into SPARK-3454_w_jersey
2382bef [Imran Rashid] switch to using new "enum"
fef6605 [Imran Rashid] some utils for working w/ new "enum" format
dbfc7bf [Imran Rashid] style
b86bcb0 [Imran Rashid] update test to look at one stage attempt
5f9df24 [Imran Rashid] style
7fd156a [Imran Rashid] refactor jsonDiff to avoid code duplication
73f1378 [Imran Rashid] test json; also add test cases for cleaned stages & jobs
97d411f [Imran Rashid] json endpoint for one job
0c96147 [Imran Rashid] better error msgs for bad stageId vs bad attemptId
dddbd29 [Imran Rashid] stages have attempt; jobs are sorted; resource for all attempts for one stage
190c17a [Imran Rashid] StagePage should distinguish no task data, from unknown stage
84cd497 [Imran Rashid] AllJobsPage should still report correct completed & failed job count, even if some have been cleaned, to make it consistent w/ AllStagesPage
36e4062 [Imran Rashid] SparkUI needs to know about startTime, so it can list its own applicationInfo
b4c75ed [Imran Rashid] fix merge conflicts; need to widen visibility in a few cases
e91750a [Imran Rashid] Merge branch 'master' into SPARK-3454_w_jersey
56d2fc7 [Imran Rashid] jersey needs asm for SPARK_PREPEND_CLASSES to work
f7df095 [Imran Rashid] add test for accumulables, and discover that I need update after all
9c0c125 [Imran Rashid] add accumulableInfo
00e9cc5 [Imran Rashid] more style
3377e61 [Imran Rashid] scaladoc
d05f7a9 [Imran Rashid] dont use case classes for status api POJOs, since they have binary compatibility issues
654cecf [Imran Rashid] move all the status api POJOs to one file
b86e2b0 [Imran Rashid] style
18a8c45 [Imran Rashid] Merge branch 'master' into SPARK-3454_w_jersey
5598f19 [Imran Rashid] delete some unnecessary code, more to go
56edce0 [Imran Rashid] style
017c755 [Imran Rashid] add in metrics now available
1b78cb7 [Imran Rashid] fix some import ordering
0dc3ea7 [Imran Rashid] if app isnt found, reload apps from FS before giving up
c7d884f [Imran Rashid] fix merge conflicts
0c12b50 [Imran Rashid] Merge branch 'master' into SPARK-3454_w_jersey
b6a96a8 [Imran Rashid] compare json by AST, not string
cd37845 [Imran Rashid] switch to using java.util.Dates for times
a4ab5aa [Imran Rashid] add in explicit dependency on jersey 1.9 -- maven wasn't happy before this
4fdc39f [Imran Rashid] refactor case insensitive enum parsing
cba1ef6 [Imran Rashid] add security (maybe?) for metrics json
f0264a7 [Imran Rashid] switch to using jersey for metrics json
bceb3a9 [Imran Rashid] set http response code on error, some testing
e0356b6 [Imran Rashid] put new test expectation files in rat excludes (is this OK?)
b252e7a [Imran Rashid] small cleanup of accidental changes
d1a8c92 [Imran Rashid] add sbt-revolved plugin, to make it easier to start & stop http servers in sbt
4b398d0 [Imran Rashid] expose UI data as json in new endpoints
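One of the squashed commits above ("compare json by AST, not string", b6a96a8) captures how `HistoryServerSuite` checks endpoint responses against the expectation files in this tree: both sides are parsed and the resulting structures compared, so whitespace and key order cannot cause spurious failures. The suite itself is Scala; the following is only a minimal Python sketch of the same idea, with illustrative names:

```python
import json

def same_json(expected: str, actual: str) -> bool:
    """Compare two JSON documents structurally (by parsed AST),
    ignoring formatting differences and object key order."""
    return json.loads(expected) == json.loads(actual)

# Key order and whitespace differ, but the parsed structures match:
expected = '{ "jobId" : 0, "status" : "SUCCEEDED" }'
actual = '{"status":"SUCCEEDED","jobId":0}'
```

Comparing by AST is what makes storing pretty-printed expectation files (like the ones in this diff) safe: the server's serializer is free to change its formatting without breaking the tests.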
Diffstat (limited to 'core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767')
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/executors/json_expectation  17
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/0/json_expectation  15
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/json_expectation  43
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded&status=failed/json_expectation  43
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded/json_expectation  29
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/json_expectation  10
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/0/json_expectation  270
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/json_expectation  270
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/json_expectation  89
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=complete/json_expectation  67
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=failed/json_expectation  23
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/0/json_expectation  64
-rw-r--r--  core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/json_expectation  9
13 files changed, 949 insertions, 0 deletions
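Note that the expectation files encode each endpoint's path and query string directly in the filename (e.g. `jobs?status=succeeded&status=failed/json_expectation`), so a test can recover the HTTP request from the file path alone. A hypothetical Python sketch of that mapping (the helper name is illustrative, not from the suite):

```python
from urllib.parse import urlsplit, parse_qsl

def endpoint_from_expectation(path: str):
    """Recover (url_path, query_params) from an expectation-file path,
    e.g. 'applications/<app-id>/jobs?status=succeeded/json_expectation'."""
    # The trailing filename is always 'json_expectation'; strip it first.
    endpoint = path.rsplit("/json_expectation", 1)[0]
    parts = urlsplit(endpoint)
    # parse_qsl keeps repeated keys, so status=succeeded&status=failed
    # yields two (key, value) pairs, matching the multi-valued filter.
    return parts.path, parse_qsl(parts.query)

path, query = endpoint_from_expectation(
    "applications/local-1422981780767/jobs?status=succeeded&status=failed/json_expectation")
```

Encoding the query string in the directory name keeps one expectation file per distinct request, including multi-valued filters, without any extra test metadata.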
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/executors/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/executors/json_expectation
new file mode 100644
index 0000000000..cb622e1472
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/executors/json_expectation
@@ -0,0 +1,17 @@
+[ {
+ "id" : "<driver>",
+ "hostPort" : "localhost:57971",
+ "rddBlocks" : 8,
+ "memoryUsed" : 28000128,
+ "diskUsed" : 0,
+ "activeTasks" : 0,
+ "failedTasks" : 1,
+ "completedTasks" : 31,
+ "totalTasks" : 32,
+ "totalDuration" : 8820,
+ "totalInputBytes" : 28000288,
+ "totalShuffleRead" : 0,
+ "totalShuffleWrite" : 13180,
+ "maxMemory" : 278302556,
+ "executorLogs" : { }
+} ]
\ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/0/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/0/json_expectation
new file mode 100644
index 0000000000..4a29072bdb
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/0/json_expectation
@@ -0,0 +1,15 @@
+{
+ "jobId" : 0,
+ "name" : "count at <console>:15",
+ "stageIds" : [ 0 ],
+ "status" : "SUCCEEDED",
+ "numTasks" : 8,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 8,
+ "numSkippedTasks" : 8,
+ "numFailedTasks" : 0,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 0
+}
\ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/json_expectation
new file mode 100644
index 0000000000..cab4750270
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs/json_expectation
@@ -0,0 +1,43 @@
+[ {
+ "jobId" : 2,
+ "name" : "count at <console>:17",
+ "stageIds" : [ 3 ],
+ "status" : "SUCCEEDED",
+ "numTasks" : 8,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 8,
+ "numSkippedTasks" : 8,
+ "numFailedTasks" : 0,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 0
+}, {
+ "jobId" : 1,
+ "name" : "count at <console>:20",
+ "stageIds" : [ 1, 2 ],
+ "status" : "FAILED",
+ "numTasks" : 16,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 15,
+ "numSkippedTasks" : 15,
+ "numFailedTasks" : 1,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 1
+}, {
+ "jobId" : 0,
+ "name" : "count at <console>:15",
+ "stageIds" : [ 0 ],
+ "status" : "SUCCEEDED",
+ "numTasks" : 8,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 8,
+ "numSkippedTasks" : 8,
+ "numFailedTasks" : 0,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 0
+} ]
\ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded&status=failed/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded&status=failed/json_expectation
new file mode 100644
index 0000000000..cab4750270
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded&status=failed/json_expectation
@@ -0,0 +1,43 @@
+[ {
+ "jobId" : 2,
+ "name" : "count at <console>:17",
+ "stageIds" : [ 3 ],
+ "status" : "SUCCEEDED",
+ "numTasks" : 8,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 8,
+ "numSkippedTasks" : 8,
+ "numFailedTasks" : 0,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 0
+}, {
+ "jobId" : 1,
+ "name" : "count at <console>:20",
+ "stageIds" : [ 1, 2 ],
+ "status" : "FAILED",
+ "numTasks" : 16,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 15,
+ "numSkippedTasks" : 15,
+ "numFailedTasks" : 1,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 1
+}, {
+ "jobId" : 0,
+ "name" : "count at <console>:15",
+ "stageIds" : [ 0 ],
+ "status" : "SUCCEEDED",
+ "numTasks" : 8,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 8,
+ "numSkippedTasks" : 8,
+ "numFailedTasks" : 0,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 0
+} ]
\ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded/json_expectation
new file mode 100644
index 0000000000..6fd25befbf
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/jobs?status=succeeded/json_expectation
@@ -0,0 +1,29 @@
+[ {
+ "jobId" : 2,
+ "name" : "count at <console>:17",
+ "stageIds" : [ 3 ],
+ "status" : "SUCCEEDED",
+ "numTasks" : 8,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 8,
+ "numSkippedTasks" : 8,
+ "numFailedTasks" : 0,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 0
+}, {
+ "jobId" : 0,
+ "name" : "count at <console>:15",
+ "stageIds" : [ 0 ],
+ "status" : "SUCCEEDED",
+ "numTasks" : 8,
+ "numActiveTasks" : 0,
+ "numCompletedTasks" : 8,
+ "numSkippedTasks" : 8,
+ "numFailedTasks" : 0,
+ "numActiveStages" : 0,
+ "numCompletedStages" : 1,
+ "numSkippedStages" : 0,
+ "numFailedStages" : 0
+} ]
\ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/json_expectation
new file mode 100644
index 0000000000..07489ad964
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/json_expectation
@@ -0,0 +1,10 @@
+{
+ "id" : "local-1422981780767",
+ "name" : "Spark shell",
+ "attempts" : [ {
+ "startTime" : "2015-02-03T16:42:59.720GMT",
+ "endTime" : "2015-02-03T16:43:08.731GMT",
+ "sparkUser" : "irashid",
+ "completed" : true
+ } ]
+}
\ No newline at end of file
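The attempt timestamps in the application record above use a literal `GMT` suffix (e.g. `2015-02-03T16:42:59.720GMT`) rather than an ISO-8601 offset, so off-the-shelf ISO parsers will reject them. A hedged sketch of parsing them and deriving the attempt duration; the helper is illustrative, not Spark's own code:

```python
from datetime import datetime, timedelta, timezone

def parse_ui_timestamp(ts: str) -> datetime:
    """Parse timestamps like '2015-02-03T16:42:59.720GMT'. The suffix is
    the literal string 'GMT', not a '+00:00' offset, so strip it and
    attach UTC explicitly."""
    return datetime.strptime(ts.removesuffix("GMT"),
                             "%Y-%m-%dT%H:%M:%S.%f").replace(tzinfo=timezone.utc)

start = parse_ui_timestamp("2015-02-03T16:42:59.720GMT")
end = parse_ui_timestamp("2015-02-03T16:43:08.731GMT")
# Integer timedelta division avoids float rounding on the millisecond count.
duration_ms = (end - start) // timedelta(milliseconds=1)
```

The example values are the `startTime`/`endTime` from the expectation file above, giving a 9011 ms attempt.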
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/0/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/0/json_expectation
new file mode 100644
index 0000000000..111cb8163e
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/0/json_expectation
@@ -0,0 +1,270 @@
+{
+ "status" : "COMPLETE",
+ "stageId" : 1,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 3476,
+ "inputBytes" : 28000128,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 13180,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "map at <console>:14",
+ "details" : "org.apache.spark.rdd.RDD.map(RDD.scala:271)\n$line10.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:14)\n$line10.$read$$iwC$$iwC$$iwC.<init>(<console>:19)\n$line10.$read$$iwC$$iwC.<init>(<console>:21)\n$line10.$read$$iwC.<init>(<console>:23)\n$line10.$read.<init>(<console>:25)\n$line10.$read$.<init>(<console>:29)\n$line10.$read$.<clinit>(<console>)\n$line10.$eval$.<init>(<console>:7)\n$line10.$eval$.<clinit>(<console>)\n$line10.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ],
+ "tasks" : {
+ "8" : {
+ "taskId" : 8,
+ "index" : 0,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.829GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 1,
+ "executorRunTime" : 435,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 2,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 94000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "11" : {
+ "taskId" : 11,
+ "index" : 3,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.830GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1647,
+ "writeTime" : 83000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "14" : {
+ "taskId" : 14,
+ "index" : 6,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.832GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 88000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "13" : {
+ "taskId" : 13,
+ "index" : 5,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.831GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 2,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 73000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "10" : {
+ "taskId" : 10,
+ "index" : 2,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.830GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 76000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "9" : {
+ "taskId" : 9,
+ "index" : 1,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.830GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 1,
+ "executorRunTime" : 436,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 98000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "12" : {
+ "taskId" : 12,
+ "index" : 4,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.831GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1645,
+ "writeTime" : 101000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "15" : {
+ "taskId" : 15,
+ "index" : 7,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.833GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 1,
+ "executorRunTime" : 435,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 79000,
+ "recordsWritten" : 0
+ }
+ }
+ }
+ },
+ "executorSummary" : {
+ "<driver>" : {
+ "taskTime" : 3624,
+ "failedTasks" : 0,
+ "succeededTasks" : 8,
+ "inputBytes" : 28000128,
+ "outputBytes" : 0,
+ "shuffleRead" : 0,
+ "shuffleWrite" : 13180,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0
+ }
+ }
+}
\ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/json_expectation
new file mode 100644
index 0000000000..ef339f89af
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/1/json_expectation
@@ -0,0 +1,270 @@
+[ {
+ "status" : "COMPLETE",
+ "stageId" : 1,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 3476,
+ "inputBytes" : 28000128,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 13180,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "map at <console>:14",
+ "details" : "org.apache.spark.rdd.RDD.map(RDD.scala:271)\n$line10.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:14)\n$line10.$read$$iwC$$iwC$$iwC.<init>(<console>:19)\n$line10.$read$$iwC$$iwC.<init>(<console>:21)\n$line10.$read$$iwC.<init>(<console>:23)\n$line10.$read.<init>(<console>:25)\n$line10.$read$.<init>(<console>:29)\n$line10.$read$.<clinit>(<console>)\n$line10.$eval$.<init>(<console>:7)\n$line10.$eval$.<clinit>(<console>)\n$line10.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ],
+ "tasks" : {
+ "8" : {
+ "taskId" : 8,
+ "index" : 0,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.829GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 1,
+ "executorRunTime" : 435,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 2,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 94000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "11" : {
+ "taskId" : 11,
+ "index" : 3,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.830GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1647,
+ "writeTime" : 83000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "14" : {
+ "taskId" : 14,
+ "index" : 6,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.832GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 88000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "13" : {
+ "taskId" : 13,
+ "index" : 5,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.831GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 2,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 73000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "10" : {
+ "taskId" : 10,
+ "index" : 2,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.830GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 76000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "9" : {
+ "taskId" : 9,
+ "index" : 1,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.830GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 1,
+ "executorRunTime" : 436,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 98000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "12" : {
+ "taskId" : 12,
+ "index" : 4,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.831GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 2,
+ "executorRunTime" : 434,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1645,
+ "writeTime" : 101000,
+ "recordsWritten" : 0
+ }
+ }
+ },
+ "15" : {
+ "taskId" : 15,
+ "index" : 7,
+ "attempt" : 0,
+ "launchTime" : "2015-02-03T16:43:05.833GMT",
+ "executorId" : "<driver>",
+ "host" : "localhost",
+ "taskLocality" : "PROCESS_LOCAL",
+ "speculative" : false,
+ "accumulatorUpdates" : [ ],
+ "taskMetrics" : {
+ "executorDeserializeTime" : 1,
+ "executorRunTime" : 435,
+ "resultSize" : 1902,
+ "jvmGcTime" : 19,
+ "resultSerializationTime" : 1,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "inputMetrics" : {
+ "bytesRead" : 3500016,
+ "recordsRead" : 0
+ },
+ "shuffleWriteMetrics" : {
+ "bytesWritten" : 1648,
+ "writeTime" : 79000,
+ "recordsWritten" : 0
+ }
+ }
+ }
+ },
+ "executorSummary" : {
+ "<driver>" : {
+ "taskTime" : 3624,
+ "failedTasks" : 0,
+ "succeededTasks" : 8,
+ "inputBytes" : 28000128,
+ "outputBytes" : 0,
+ "shuffleRead" : 0,
+ "shuffleWrite" : 13180,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0
+ }
+ }
+} ]
\ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/json_expectation
new file mode 100644
index 0000000000..056fac7088
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages/json_expectation
@@ -0,0 +1,89 @@
+[ {
+ "status" : "COMPLETE",
+ "stageId" : 3,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 162,
+ "inputBytes" : 160,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 0,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "count at <console>:17",
+ "details" : "org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line19.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:17)\n$line19.$read$$iwC$$iwC$$iwC.<init>(<console>:22)\n$line19.$read$$iwC$$iwC.<init>(<console>:24)\n$line19.$read$$iwC.<init>(<console>:26)\n$line19.$read.<init>(<console>:28)\n$line19.$read$.<init>(<console>:32)\n$line19.$read$.<clinit>(<console>)\n$line19.$eval$.<init>(<console>:7)\n$line19.$eval$.<clinit>(<console>)\n$line19.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+}, {
+ "status" : "COMPLETE",
+ "stageId" : 1,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 3476,
+ "inputBytes" : 28000128,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 13180,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "map at <console>:14",
+ "details" : "org.apache.spark.rdd.RDD.map(RDD.scala:271)\n$line10.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:14)\n$line10.$read$$iwC$$iwC$$iwC.<init>(<console>:19)\n$line10.$read$$iwC$$iwC.<init>(<console>:21)\n$line10.$read$$iwC.<init>(<console>:23)\n$line10.$read.<init>(<console>:25)\n$line10.$read$.<init>(<console>:29)\n$line10.$read$.<clinit>(<console>)\n$line10.$eval$.<init>(<console>:7)\n$line10.$eval$.<clinit>(<console>)\n$line10.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+}, {
+ "status" : "COMPLETE",
+ "stageId" : 0,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 4338,
+ "inputBytes" : 0,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 0,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "count at <console>:15",
+ "details" : "org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)\n$line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)\n$line9.$read$$iwC$$iwC.<init>(<console>:22)\n$line9.$read$$iwC.<init>(<console>:24)\n$line9.$read.<init>(<console>:26)\n$line9.$read$.<init>(<console>:30)\n$line9.$read$.<clinit>(<console>)\n$line9.$eval$.<init>(<console>:7)\n$line9.$eval$.<clinit>(<console>)\n$line9.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+}, {
+ "status" : "FAILED",
+ "stageId" : 2,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 7,
+ "numFailedTasks" : 1,
+ "executorRunTime" : 278,
+ "inputBytes" : 0,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 0,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "count at <console>:20",
+ "details" : "org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line11.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:20)\n$line11.$read$$iwC$$iwC$$iwC.<init>(<console>:25)\n$line11.$read$$iwC$$iwC.<init>(<console>:27)\n$line11.$read$$iwC.<init>(<console>:29)\n$line11.$read.<init>(<console>:31)\n$line11.$read$.<init>(<console>:35)\n$line11.$read$.<clinit>(<console>)\n$line11.$eval$.<init>(<console>:7)\n$line11.$eval$.<clinit>(<console>)\n$line11.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+} ] \ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=complete/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=complete/json_expectation
new file mode 100644
index 0000000000..31ac9beea8
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=complete/json_expectation
@@ -0,0 +1,67 @@
+[ {
+ "status" : "COMPLETE",
+ "stageId" : 3,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 162,
+ "inputBytes" : 160,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 0,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "count at <console>:17",
+ "details" : "org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line19.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:17)\n$line19.$read$$iwC$$iwC$$iwC.<init>(<console>:22)\n$line19.$read$$iwC$$iwC.<init>(<console>:24)\n$line19.$read$$iwC.<init>(<console>:26)\n$line19.$read.<init>(<console>:28)\n$line19.$read$.<init>(<console>:32)\n$line19.$read$.<clinit>(<console>)\n$line19.$eval$.<init>(<console>:7)\n$line19.$eval$.<clinit>(<console>)\n$line19.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+}, {
+ "status" : "COMPLETE",
+ "stageId" : 1,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 3476,
+ "inputBytes" : 28000128,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 13180,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "map at <console>:14",
+ "details" : "org.apache.spark.rdd.RDD.map(RDD.scala:271)\n$line10.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:14)\n$line10.$read$$iwC$$iwC$$iwC.<init>(<console>:19)\n$line10.$read$$iwC$$iwC.<init>(<console>:21)\n$line10.$read$$iwC.<init>(<console>:23)\n$line10.$read.<init>(<console>:25)\n$line10.$read$.<init>(<console>:29)\n$line10.$read$.<clinit>(<console>)\n$line10.$eval$.<init>(<console>:7)\n$line10.$eval$.<clinit>(<console>)\n$line10.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+}, {
+ "status" : "COMPLETE",
+ "stageId" : 0,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 8,
+ "numFailedTasks" : 0,
+ "executorRunTime" : 4338,
+ "inputBytes" : 0,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 0,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "count at <console>:15",
+ "details" : "org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line9.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:15)\n$line9.$read$$iwC$$iwC$$iwC.<init>(<console>:20)\n$line9.$read$$iwC$$iwC.<init>(<console>:22)\n$line9.$read$$iwC.<init>(<console>:24)\n$line9.$read.<init>(<console>:26)\n$line9.$read$.<init>(<console>:30)\n$line9.$read$.<clinit>(<console>)\n$line9.$eval$.<init>(<console>:7)\n$line9.$eval$.<clinit>(<console>)\n$line9.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+} ] \ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=failed/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=failed/json_expectation
new file mode 100644
index 0000000000..bff6a4f69d
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/stages?status=failed/json_expectation
@@ -0,0 +1,23 @@
+[ {
+ "status" : "FAILED",
+ "stageId" : 2,
+ "attemptId" : 0,
+ "numActiveTasks" : 0,
+ "numCompleteTasks" : 7,
+ "numFailedTasks" : 1,
+ "executorRunTime" : 278,
+ "inputBytes" : 0,
+ "inputRecords" : 0,
+ "outputBytes" : 0,
+ "outputRecords" : 0,
+ "shuffleReadBytes" : 0,
+ "shuffleReadRecords" : 0,
+ "shuffleWriteBytes" : 0,
+ "shuffleWriteRecords" : 0,
+ "memoryBytesSpilled" : 0,
+ "diskBytesSpilled" : 0,
+ "name" : "count at <console>:20",
+ "details" : "org.apache.spark.rdd.RDD.count(RDD.scala:910)\n$line11.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:20)\n$line11.$read$$iwC$$iwC$$iwC.<init>(<console>:25)\n$line11.$read$$iwC$$iwC.<init>(<console>:27)\n$line11.$read$$iwC.<init>(<console>:29)\n$line11.$read.<init>(<console>:31)\n$line11.$read$.<init>(<console>:35)\n$line11.$read$.<clinit>(<console>)\n$line11.$eval$.<init>(<console>:7)\n$line11.$eval$.<clinit>(<console>)\n$line11.$eval.$print(<console>)\nsun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\nsun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)\nsun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\njava.lang.reflect.Method.invoke(Method.java:606)\norg.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)\norg.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)\norg.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)\norg.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)",
+ "schedulingPool" : "default",
+ "accumulatorUpdates" : [ ]
+} ] \ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/0/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/0/json_expectation
new file mode 100644
index 0000000000..38b5328ffb
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/0/json_expectation
@@ -0,0 +1,64 @@
+{
+ "id" : 0,
+ "name" : "0",
+ "numPartitions" : 8,
+ "numCachedPartitions" : 8,
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 28000128,
+ "diskUsed" : 0,
+ "dataDistribution" : [ {
+ "address" : "localhost:57971",
+ "memoryUsed" : 28000128,
+ "memoryRemaining" : 250302428,
+ "diskUsed" : 0
+ } ],
+ "partitions" : [ {
+ "blockName" : "rdd_0_0",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ }, {
+ "blockName" : "rdd_0_1",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ }, {
+ "blockName" : "rdd_0_2",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ }, {
+ "blockName" : "rdd_0_3",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ }, {
+ "blockName" : "rdd_0_4",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ }, {
+ "blockName" : "rdd_0_5",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ }, {
+ "blockName" : "rdd_0_6",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ }, {
+ "blockName" : "rdd_0_7",
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 3500016,
+ "diskUsed" : 0,
+ "executors" : [ "localhost:57971" ]
+ } ]
+} \ No newline at end of file
diff --git a/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/json_expectation b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/json_expectation
new file mode 100644
index 0000000000..f79a31022d
--- /dev/null
+++ b/core/src/test/resources/HistoryServerExpectations/applications/local-1422981780767/storage/rdd/json_expectation
@@ -0,0 +1,9 @@
+[ {
+ "id" : 0,
+ "name" : "0",
+ "numPartitions" : 8,
+ "numCachedPartitions" : 8,
+ "storageLevel" : "Memory Deserialized 1x Replicated",
+ "memoryUsed" : 28000128,
+ "diskUsed" : 0
+} ] \ No newline at end of file