author: Shuai Lin <linshuai2012@gmail.com> 2016-10-26 14:31:47 +0200
committer: Sean Owen <sowen@cloudera.com> 2016-10-26 14:31:47 +0200
commit: 402205ddf749e7478683ce1b0443df63b46b03fd
tree: a01ca4670c7ea607767650b831cc0c50d86f8784 /core/src/test/scala
parent: 5d0f81da49e86ee93ecf679a20d024ea2cb8b3d3
[SPARK-17802] Improved caller context logging.
## What changes were proposed in this pull request?
[SPARK-16757](https://issues.apache.org/jira/browse/SPARK-16757) sets the Hadoop `CallerContext` when calling Hadoop/HDFS APIs to make Spark applications more diagnosable in Hadoop/HDFS logs. However, the `org.apache.hadoop.ipc.CallerContext` class was only added in [Hadoop 2.8](https://issues.apache.org/jira/browse/HDFS-9184), which has not been officially released yet. So each time `utils.CallerContext.setCurrentContext()` is called (e.g. [when a task is created](https://github.com/apache/spark/blob/b678e46/core/src/main/scala/org/apache/spark/scheduler/Task.scala#L95-L96)), a "java.lang.ClassNotFoundException: org.apache.hadoop.ipc.CallerContext"
error is logged, which pollutes the Spark logs when there are lots of tasks.
This patch improves this behaviour by logging the `ClassNotFoundException` only once.
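The test diff below checks a `CallerContext.callerContextSupported` flag, which suggests the availability check (and its warning) happens once rather than on every call. A minimal, self-contained sketch of that log-once pattern, assuming a hypothetical object name and using a counter in place of the real `logWarning` call:

```scala
object CallerContextSketch {
  // Hypothetical stand-in for logWarning: counts how many times we warn.
  private var warnCount = 0

  // A lazy val is initialized at most once (thread-safely in Scala),
  // so the ClassNotFoundException branch, and its warning, run at most once,
  // no matter how many tasks consult this flag afterwards.
  lazy val callerContextSupported: Boolean =
    try {
      Class.forName("org.apache.hadoop.ipc.CallerContext")
      true
    } catch {
      case _: ClassNotFoundException =>
        warnCount += 1 // the real patch would log a warning here, once
        false
    }

  def warnings: Int = warnCount
}
```

Calling `CallerContextSketch.callerContextSupported` from every task would then reuse the cached result instead of re-raising and re-logging the exception each time.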
## How was this patch tested?
Existing tests.
Author: Shuai Lin <linshuai2012@gmail.com>
Closes #15377 from lins05/spark-17802-improve-callercontext-logging.
Diffstat (limited to 'core/src/test/scala')
-rw-r--r-- | core/src/test/scala/org/apache/spark/util/UtilsSuite.scala | 7 |
1 file changed, 2 insertions(+), 5 deletions(-)
```diff
diff --git a/core/src/test/scala/org/apache/spark/util/UtilsSuite.scala b/core/src/test/scala/org/apache/spark/util/UtilsSuite.scala
index 4dda80f10a..aeb2969fd5 100644
--- a/core/src/test/scala/org/apache/spark/util/UtilsSuite.scala
+++ b/core/src/test/scala/org/apache/spark/util/UtilsSuite.scala
@@ -843,14 +843,11 @@ class UtilsSuite extends SparkFunSuite with ResetSystemProperties with Logging {
   test("Set Spark CallerContext") {
     val context = "test"
-    try {
+    new CallerContext(context).setCurrentContext()
+    if (CallerContext.callerContextSupported) {
       val callerContext = Utils.classForName("org.apache.hadoop.ipc.CallerContext")
-      assert(new CallerContext(context).setCurrentContext())
       assert(s"SPARK_$context" ===
         callerContext.getMethod("getCurrent").invoke(null).toString)
-    } catch {
-      case e: ClassNotFoundException =>
-        assert(!new CallerContext(context).setCurrentContext())
     }
   }
```