| author | Thomas Graves <tgraves@apache.org> | 2016-01-08 14:38:19 -0600 |
| --- | --- | --- |
| committer | Tom Graves <tgraves@yahoo-inc.com> | 2016-01-08 14:38:19 -0600 |
| commit | 553fd7b912a32476b481fd3f80c1d0664b6c6484 (patch) | |
| tree | e3cbc5f693c18175be7bb06ebce274bf757a9f57 /extras/java8-tests | |
| parent | 8c70cb4c62a353bea99f37965dfc829c4accc391 (diff) | |
[SPARK-12654] sc.wholeTextFiles with spark.hadoop.cloneConf=true fails on secure Hadoop
https://issues.apache.org/jira/browse/SPARK-12654
The bug is that WholeTextFileRDD.getPartitions has:
val conf = getConf
When cloneConf=true, getConf creates a new Hadoop Configuration and then uses it to create a new JobContext via newJobContext.
newJobContext copies credentials around, but credentials are only present on a JobConf, not on a plain Hadoop Configuration. So when the Hadoop configuration is cloned, a JobConf is turned into a Configuration and the credentials it carried are dropped. NewHadoopRDD works because its getPartitions uses the conf passed in directly (not getConf).
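The credential loss can be sketched with simplified stand-in classes. Note these are hypothetical stand-ins for Hadoop's Configuration and JobConf (the real classes have far richer APIs); the point is only that a copy made through the Configuration copy constructor cannot carry fields that exist solely on the JobConf subclass:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for org.apache.hadoop.conf.Configuration.
class Configuration {
    final Map<String, String> props = new HashMap<>();

    Configuration() {}

    // Copy constructor: copies properties only. There is no credentials
    // field at this level, so nothing else survives the clone.
    Configuration(Configuration other) {
        props.putAll(other.props);
    }
}

// Hypothetical stand-in for org.apache.hadoop.mapred.JobConf, which is
// where delegation tokens/credentials actually live.
class JobConf extends Configuration {
    final Map<String, byte[]> credentials = new HashMap<>();
}

public class CloneConfDemo {
    public static void main(String[] args) {
        JobConf original = new JobConf();
        original.props.put("fs.defaultFS", "hdfs://nn:8020");
        original.credentials.put("hdfs-token", new byte[] {1, 2, 3});

        // What a cloneConf=true code path effectively does: clone through
        // the Configuration copy constructor.
        Configuration cloned = new Configuration(original);

        // Ordinary properties survive the clone...
        System.out.println(cloned.props.get("fs.defaultFS"));
        // ...but the clone is a plain Configuration, not a JobConf, so the
        // credentials map (and any tokens in it) is gone.
        System.out.println(cloned instanceof JobConf);
    }
}
```

Passing the original conf through unchanged (as NewHadoopRDD's getPartitions does) sidesteps this, because the JobConf and its credentials are never downgraded to a plain Configuration.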
Author: Thomas Graves <tgraves@staydecay.corp.gq1.yahoo.com>
Closes #10651 from tgravescs/SPARK-12654.
Diffstat (limited to 'extras/java8-tests')
0 files changed, 0 insertions, 0 deletions