author     hyukjinkwon <gurwls223@gmail.com>  2016-06-02 11:16:24 -0500
committer  Sean Owen <sowen@cloudera.com>     2016-06-02 11:16:24 -0500
commit     252417fa21eb47781addfd614ff00dac793b52a9
tree       679129d3ba18ad804dc764a1107f14c271dbb1ea /core
parent     b85d18f3bdedca7ae7f2c26ff64ce38c2796bd63
[SPARK-15322][SQL][FOLLOWUP] Use the new long accumulator for old int accumulators.
## What changes were proposed in this pull request?
This PR corrects the remaining cases that still use the old accumulator API.
It intentionally leaves the following old accumulator usages unchanged:
- `ImplicitSuite.scala` - Tests dedicated to the old accumulator, covering implicits with `AccumulatorParam`.
- `AccumulatorSuite.scala` - Tests dedicated to the old accumulator.
- `JavaSparkContext.scala` - Keeps supporting the old accumulators in the Java API.
- `debug.package.scala` - Uses an accumulator with `HashSet[String]`. There is currently no new-API implementation for this; an anonymous class could be written for it, but that seemed like too much code for this one case.
- `SQLMetricsSuite.scala` - Uses the old accumulator to check type boxing. The new accumulator does not appear to require type boxing in this case, whereas the old one does (due to its use of generics).
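The `HashSet[String]` case above is the one gap the PR leaves open. As a hedged illustration of what such an accumulator might look like, here is a standalone sketch: `SetAccumulatorLike` is a local stand-in for Spark's `AccumulatorV2[IN, OUT]` contract (the real abstract class lives in `org.apache.spark.util` and has more methods, e.g. `copyAndReset` and `isZero`), and `StringSetAccumulator` is a hypothetical name, not anything in the Spark codebase:

```scala
import scala.collection.mutable.HashSet

// Local stand-in for Spark's AccumulatorV2[IN, OUT] contract (assumption:
// the real class in org.apache.spark.util also requires copyAndReset,
// isZero, reset, etc., omitted here for brevity).
abstract class SetAccumulatorLike[IN, OUT] {
  def add(v: IN): Unit
  def merge(other: SetAccumulatorLike[IN, OUT]): Unit
  def value: OUT
}

// Hypothetical accumulator collecting distinct strings, roughly what the
// debug.package.scala usage would need: add() inserts one element, and
// merge() unions in the set accumulated by another (task-side) copy.
class StringSetAccumulator extends SetAccumulatorLike[String, HashSet[String]] {
  private val set = new HashSet[String]
  override def add(v: String): Unit = set += v
  override def merge(other: SetAccumulatorLike[String, HashSet[String]]): Unit =
    set ++= other.value
  override def value: HashSet[String] = set
}
```

Duplicates collapse on `add`, and `merge` is a set union, so merging per-task copies on the driver yields the distinct strings seen across all tasks.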
## How was this patch tested?
Existing tests cover this.
Author: hyukjinkwon <gurwls223@gmail.com>
Closes #13434 from HyukjinKwon/accum.
Diffstat (limited to 'core')
-rw-r--r-- core/src/test/scala/org/apache/spark/DistributedSuite.scala | 5
1 file changed, 2 insertions(+), 3 deletions(-)
diff --git a/core/src/test/scala/org/apache/spark/DistributedSuite.scala b/core/src/test/scala/org/apache/spark/DistributedSuite.scala
index 0be25e9f89..6e69fc4247 100644
--- a/core/src/test/scala/org/apache/spark/DistributedSuite.scala
+++ b/core/src/test/scala/org/apache/spark/DistributedSuite.scala
@@ -92,8 +92,8 @@ class DistributedSuite extends SparkFunSuite with Matchers with LocalSparkContex
   test("accumulators") {
     sc = new SparkContext(clusterUrl, "test")
-    val accum = sc.accumulator(0)
-    sc.parallelize(1 to 10, 10).foreach(x => accum += x)
+    val accum = sc.longAccumulator
+    sc.parallelize(1 to 10, 10).foreach(x => accum.add(x))
     assert(accum.value === 55)
   }
@@ -109,7 +109,6 @@ class DistributedSuite extends SparkFunSuite with Matchers with LocalSparkContex
   test("repeatedly failing task") {
     sc = new SparkContext(clusterUrl, "test")
-    val accum = sc.accumulator(0)
     val thrown = intercept[SparkException] {
       // scalastyle:off println
       sc.parallelize(1 to 10, 10).foreach(x => println(x / 0))
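The hunk above swaps `sc.accumulator(0)` with `+=` for `sc.longAccumulator` with `.add(...)`. As a sketch of the semantics behind that new API, here is a minimal standalone class mirroring how `LongAccumulator` accumulates a sum and a count per task copy and then merges copies on the driver. The class name and fields are assumptions for illustration, not Spark's actual implementation (the real one is returned by `SparkContext.longAccumulator`):

```scala
// Standalone sketch of LongAccumulator-style semantics (assumption: names
// and fields are illustrative, not Spark's implementation). Each task
// would add() into its own copy; the driver merge()s the copies together.
class LongAccumulatorSketch {
  private var _sum = 0L
  private var _count = 0L

  // Record one value: accumulate the running sum and the element count.
  def add(v: Long): Unit = { _sum += v; _count += 1 }

  // Fold another (task-side) copy's totals into this one.
  def merge(other: LongAccumulatorSketch): Unit = {
    _sum += other.sum
    _count += other.count
  }

  def sum: Long = _sum
  def count: Long = _count
  def value: Long = _sum
}
```

For the test in the diff, simulating two partitions holding `1 to 5` and `6 to 10`, adding into two copies and merging them on a "driver" copy reproduces the asserted total of 55 regardless of merge order, which is why per-partition accumulation followed by a driver-side merge is safe.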