| field | value | date |
|---|---|---|
| author | Nathan Kronenfeld <nkronenfeld@oculusinfo.com> | 2014-10-05 21:03:48 -0700 |
| committer | Patrick Wendell <pwendell@gmail.com> | 2014-10-05 21:03:48 -0700 |
| commit | fd7b15539669b14996a51610d6724ca0811f9d65 (patch) | |
| tree | 1491a0cccc35e393ec9c98793e7c3aedbc32b11c /core | |
| parent | 8d22dbb5ec7a0727afdfebbbc2c57ffdb384dd0b (diff) | |
| download | spark-fd7b15539669b14996a51610d6724ca0811f9d65.tar.gz, .tar.bz2, .zip | |
Rectify generic parameter names between SparkContext and AccumulableParam
AccumulableParam gave its generic parameters as 'R, T', whereas SparkContext labeled them 'T, R'.
Trivial, but really confusing.
I resolved this in favor of AccumulableParam, because it seemed to have some logic behind its names. I also extended this minimal, but at least present, justification into the SparkContext comments.
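To see why the `R, T` ordering reads better, here is a minimal sketch of using `SparkContext.accumulable` with the corrected signature. The `StringSetParam` object and the app name are hypothetical illustrations (not from this patch); the sketch assumes a local Spark deployment on the classpath. `R` (the accumulator's result type) is `mutable.Set[String]` and `T` (the type of values tasks add) is `String`:

```scala
import org.apache.spark.{AccumulableParam, SparkConf, SparkContext}
import scala.collection.mutable

// Hypothetical AccumulableParam that collects strings into a set.
// R = mutable.Set[String] (result type), T = String (added type),
// matching the corrected accumulable[R, T] parameter order.
object StringSetParam extends AccumulableParam[mutable.Set[String], String] {
  def addAccumulator(set: mutable.Set[String], s: String): mutable.Set[String] = set += s
  def addInPlace(s1: mutable.Set[String], s2: mutable.Set[String]): mutable.Set[String] = s1 ++= s2
  def zero(initial: mutable.Set[String]): mutable.Set[String] = mutable.Set.empty[String]
}

object AccumulableDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("accumulable-demo").setMaster("local[*]"))

    // With the corrected signature, the result type R comes first:
    // accumulable[R, T](initialValue: R)(implicit param: AccumulableParam[R, T])
    val seen = sc.accumulable(mutable.Set.empty[String])(StringSetParam)

    sc.parallelize(Seq("a", "b", "a")).foreach(x => seen += x)

    // Only the driver may read `value`; tasks may only add with `+=`.
    println(seen.value)
    sc.stop()
  }
}
```

Before this patch the call site and the implicit's type parameters were spelled in opposite orders, which made such code confusing to read against the Scaladoc.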
Author: Nathan Kronenfeld <nkronenfeld@oculusinfo.com>
Closes #2637 from nkronenfeld/accumulators and squashes the following commits:
98d6b74 [Nathan Kronenfeld] Rectify generic parameter names between SparkContext and AccumulableParam
Diffstat (limited to 'core')
-rw-r--r-- | core/src/main/scala/org/apache/spark/SparkContext.scala | 12 |
1 file changed, 6 insertions(+), 6 deletions(-)
```diff
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 97109b9f41..396cdd1247 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -779,20 +779,20 @@ class SparkContext(config: SparkConf) extends Logging {
   /**
    * Create an [[org.apache.spark.Accumulable]] shared variable, to which tasks can add values
    * with `+=`. Only the driver can access the accumuable's `value`.
-   * @tparam T accumulator type
-   * @tparam R type that can be added to the accumulator
+   * @tparam R accumulator result type
+   * @tparam T type that can be added to the accumulator
    */
-  def accumulable[T, R](initialValue: T)(implicit param: AccumulableParam[T, R]) =
+  def accumulable[R, T](initialValue: R)(implicit param: AccumulableParam[R, T]) =
     new Accumulable(initialValue, param)

   /**
    * Create an [[org.apache.spark.Accumulable]] shared variable, with a name for display in the
    * Spark UI. Tasks can add values to the accumuable using the `+=` operator. Only the driver can
    * access the accumuable's `value`.
-   * @tparam T accumulator type
-   * @tparam R type that can be added to the accumulator
+   * @tparam R accumulator result type
+   * @tparam T type that can be added to the accumulator
    */
-  def accumulable[T, R](initialValue: T, name: String)(implicit param: AccumulableParam[T, R]) =
+  def accumulable[R, T](initialValue: R, name: String)(implicit param: AccumulableParam[R, T]) =
     new Accumulable(initialValue, param, Some(name))

   /**
```