author    hyukjinkwon <gurwls223@gmail.com>    2016-04-18 13:45:03 -0700
committer Reynold Xin <rxin@databricks.com>    2016-04-18 13:45:03 -0700
commit    6fc1e72d9b70615bd91b598084406eb1893d6706 (patch)
tree      ecf9380e635d2adb05b810fbdb5a5b3cf6429f06 /examples/src/main/scala
parent    8c62edb70fdeedf0ca5a7fc154698aea96184cc6 (diff)
[MINOR] Revert removing explicit typing (changed in some examples and StatFunctions)
## What changes were proposed in this pull request?
This PR reverts some of the changes made in https://github.com/apache/spark/pull/12413 (please see the discussion in that PR), changing the examples back
from
```scala
words.foreachRDD { (rdd, time) =>
...
```
to
```scala
words.foreachRDD { (rdd: RDD[String], time: Time) =>
...
```
This was also discussed on the dev mailing list, [here](http://apache-spark-developers-list.1001551.n3.nabble.com/Question-about-Scala-style-explicit-typing-within-transformation-functions-and-anonymous-val-td17173.html).
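The style being restored can be contrasted outside Spark with a small plain-Scala sketch. The `forEachBatch` helper below is hypothetical, invented here only to mirror `foreachRDD`'s two-argument closure; it is not part of this PR or of Spark:

```scala
// Minimal sketch (no Spark required): a hypothetical helper that, like
// DStream.foreachRDD, hands the user's closure two arguments per batch.
object ExplicitTypingSketch {
  // Invokes f once per (batch, time) pair, analogous to foreachRDD's (rdd, time).
  def forEachBatch(batches: Seq[(Seq[String], Long)])(f: (Seq[String], Long) => Unit): Unit =
    batches.foreach { case (b, t) => f(b, t) }

  def main(args: Array[String]): Unit = {
    val batches = Seq((Seq("a", "b"), 1000L), (Seq("c"), 2000L))

    // Inferred parameter types (the style this PR reverts away from):
    forEachBatch(batches) { (batch, time) =>
      println(s"time=$time count=${batch.size}")
    }

    // Explicit parameter types (the style this PR restores); the behavior is
    // identical, but the closure's argument types are visible at the call site:
    forEachBatch(batches) { (batch: Seq[String], time: Long) =>
      println(s"time=$time count=${batch.size}")
    }
  }
}
```

Both calls compile and behave identically; the explicit form only makes the closure's parameter types visible where `foreachRDD` is used, which is the readability argument from the mailing-list thread.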
## How was this patch tested?
This was tested with `sbt scalastyle`.
Author: hyukjinkwon <gurwls223@gmail.com>
Closes #12452 from HyukjinKwon/revert-explicit-typing.
Diffstat (limited to 'examples/src/main/scala')
2 files changed, 2 insertions, 2 deletions
diff --git a/examples/src/main/scala/org/apache/spark/examples/streaming/RecoverableNetworkWordCount.scala b/examples/src/main/scala/org/apache/spark/examples/streaming/RecoverableNetworkWordCount.scala
index aa762b27dc..1bcd85e1d5 100644
--- a/examples/src/main/scala/org/apache/spark/examples/streaming/RecoverableNetworkWordCount.scala
+++ b/examples/src/main/scala/org/apache/spark/examples/streaming/RecoverableNetworkWordCount.scala
@@ -116,7 +116,7 @@ object RecoverableNetworkWordCount {
     val lines = ssc.socketTextStream(ip, port)
     val words = lines.flatMap(_.split(" "))
     val wordCounts = words.map((_, 1)).reduceByKey(_ + _)
-    wordCounts.foreachRDD { (rdd, time) =>
+    wordCounts.foreachRDD { (rdd: RDD[(String, Int)], time: Time) =>
       // Get or register the blacklist Broadcast
       val blacklist = WordBlacklist.getInstance(rdd.sparkContext)
       // Get or register the droppedWordsCounter Accumulator
diff --git a/examples/src/main/scala/org/apache/spark/examples/streaming/SqlNetworkWordCount.scala b/examples/src/main/scala/org/apache/spark/examples/streaming/SqlNetworkWordCount.scala
index ad6a89e320..918e124065 100644
--- a/examples/src/main/scala/org/apache/spark/examples/streaming/SqlNetworkWordCount.scala
+++ b/examples/src/main/scala/org/apache/spark/examples/streaming/SqlNetworkWordCount.scala
@@ -59,7 +59,7 @@ object SqlNetworkWordCount {
     val words = lines.flatMap(_.split(" "))

     // Convert RDDs of the words DStream to DataFrame and run SQL query
-    words.foreachRDD { (rdd, time) =>
+    words.foreachRDD { (rdd: RDD[String], time: Time) =>
       // Get the singleton instance of SQLContext
       val sqlContext = SQLContextSingleton.getInstance(rdd.sparkContext)
       import sqlContext.implicits._