author    jerryshao <sshao@hortonworks.com>  2017-02-10 13:44:26 +0000
committer Sean Owen <sowen@cloudera.com>     2017-02-10 13:44:26 +0000
commit    8e8afb3a3468aa743d13e23e10e77e94b772b2ed (patch)
tree      c6fa07c629545e5580d309603cca154d43efac99 /resource-managers
parent    d5593f7f5794bd0343e783ac4957864fed9d1b38 (diff)
[SPARK-19545][YARN] Fix compile issue for Spark on Yarn when building against Hadoop 2.6.0~2.6.3
## What changes were proposed in this pull request?
Because the relevant API was only added in Hadoop 2.6.4+, building Spark against Hadoop 2.6.0~2.6.3 hits a compile error. So this change reverts to using reflection to handle the issue.
## How was this patch tested?
Manual verification.
Author: jerryshao <sshao@hortonworks.com>
Closes #16884 from jerryshao/SPARK-19545.
Diffstat (limited to 'resource-managers')
 resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala | 25 ++++++++++++++++-----
 1 file changed, 20 insertions(+), 5 deletions(-)
diff --git a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
index 635c1ac5e3..70826ed326 100644
--- a/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
+++ b/resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
@@ -245,12 +245,27 @@ private[spark] class Client(
     }
 
     sparkConf.get(ROLLED_LOG_INCLUDE_PATTERN).foreach { includePattern =>
-      val logAggregationContext = Records.newRecord(classOf[LogAggregationContext])
-      logAggregationContext.setRolledLogsIncludePattern(includePattern)
-      sparkConf.get(ROLLED_LOG_EXCLUDE_PATTERN).foreach { excludePattern =>
-        logAggregationContext.setRolledLogsExcludePattern(excludePattern)
+      try {
+        val logAggregationContext = Records.newRecord(classOf[LogAggregationContext])
+
+        // These two methods were added in Hadoop 2.6.4, so we still need to use reflection to
+        // avoid compile error when building against Hadoop 2.6.0 ~ 2.6.3.
+        val setRolledLogsIncludePatternMethod =
+          logAggregationContext.getClass.getMethod("setRolledLogsIncludePattern", classOf[String])
+        setRolledLogsIncludePatternMethod.invoke(logAggregationContext, includePattern)
+
+        sparkConf.get(ROLLED_LOG_EXCLUDE_PATTERN).foreach { excludePattern =>
+          val setRolledLogsExcludePatternMethod =
+            logAggregationContext.getClass.getMethod("setRolledLogsExcludePattern", classOf[String])
+          setRolledLogsExcludePatternMethod.invoke(logAggregationContext, excludePattern)
+        }
+
+        appContext.setLogAggregationContext(logAggregationContext)
+      } catch {
+        case NonFatal(e) =>
+          logWarning(s"Ignoring ${ROLLED_LOG_INCLUDE_PATTERN.key} because the version of YARN " +
+            "does not support it", e)
       }
-      appContext.setLogAggregationContext(logAggregationContext)
     }
 
     appContext
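The version-probing pattern used in this change can be sketched in isolation. The following is a minimal, self-contained illustration, not Hadoop code: `LogContext` and its setter are hypothetical stand-ins for a YARN record class whose setters only exist in newer releases. The idea is to look up the setter by name with `getMethod` at runtime, so the caller compiles against any version and degrades gracefully (here, by returning `false`) when the method is absent.

```scala
import scala.util.control.NonFatal

// Hypothetical stand-in for a YARN record class; imagine the setter
// below only exists from some minor release onwards.
class LogContext {
  private var includePattern: String = ""
  def setRolledLogsIncludePattern(p: String): Unit = { includePattern = p }
  def rolledLogsIncludePattern: String = includePattern
}

object ReflectionSketch {
  // Invoke `methodName(arg)` on `target` via reflection.
  // Returns true on success, false if the method is missing.
  def trySet(target: AnyRef, methodName: String, arg: String): Boolean =
    try {
      // No compile-time reference to the method, so this compiles even
      // when building against a version that lacks it.
      val m = target.getClass.getMethod(methodName, classOf[String])
      m.invoke(target, arg)
      true
    } catch {
      case NonFatal(_) =>
        // Older version: the method does not exist, skip the feature.
        false
    }

  def main(args: Array[String]): Unit = {
    val ctx = new LogContext
    // This setter exists, so the reflective call succeeds.
    println(trySet(ctx, "setRolledLogsIncludePattern", "spark*.log"))
    // This one does not exist: the call fails softly instead of throwing.
    println(trySet(ctx, "setRolledLogsExcludePattern", "*.tmp"))
    println(ctx.rolledLogsIncludePattern)
  }
}
```

The actual patch takes the same approach but wraps the whole block in one `try`/`catch` and logs a warning, since a missing setter there simply means the running YARN version cannot honor the configured pattern.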