author     Tathagata Das <tathagata.das1565@gmail.com>    2014-05-22 20:48:55 -0700
committer  Tathagata Das <tathagata.das1565@gmail.com>    2014-05-22 20:48:55 -0700
commit     b2bdd0e505f1ae3d39c46139f17bd43779ece635 (patch)
tree       fd1f6274986c3bf259c0dd3a0adaf6c2cfddc1a1 /dev/audit-release/sbt_app_streaming
parent     cce77457e00aa5f1f4db3d50454cf257efb156ed (diff)
download   spark-b2bdd0e505f1ae3d39c46139f17bd43779ece635.tar.gz
           spark-b2bdd0e505f1ae3d39c46139f17bd43779ece635.tar.bz2
           spark-b2bdd0e505f1ae3d39c46139f17bd43779ece635.zip
Updated scripts for auditing releases
- Added script to automatically generate change list CHANGES.txt
- Added test for verifying linking against maven distributions of `spark-sql` and `spark-hive`
- Added SBT projects for testing functionality of `spark-sql` and `spark-hive`
- Fixed issues in existing tests that might have come up because of changes in Spark 1.0

Author: Tathagata Das <tathagata.das1565@gmail.com>

Closes #844 from tdas/update-dev-scripts and squashes the following commits:

25090ba [Tathagata Das] Added missing license
e2e20b3 [Tathagata Das] Updated tests for auditing releases.
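As a rough illustration of what one of the new audit SBT projects can look like, below is a minimal build definition that links against the `spark-sql` Maven artifact. This is a sketch only: the project name, Scala version, Spark version, and resolver URL are assumptions made here and are not taken from the commit.

// Hypothetical build.sbt for an sbt_app_sql-style audit project; all
// concrete values (name, versions, resolver URL) are illustrative only.
name := "Simple SQL Project"

version := "1.0"

scalaVersion := "2.10.4"

// Link against the spark-sql Maven artifact; the audit scripts would
// normally point the resolver at the staged release repository under test.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "1.0.0"

resolvers += "Audit Staging Repository" at "https://example.org/spark-staging/"

A corresponding project for `spark-hive` would differ only in the artifact name.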
Diffstat (limited to 'dev/audit-release/sbt_app_streaming')
-rw-r--r--  dev/audit-release/sbt_app_streaming/src/main/scala/StreamingApp.scala  1
1 file changed, 0 insertions, 1 deletion
diff --git a/dev/audit-release/sbt_app_streaming/src/main/scala/StreamingApp.scala b/dev/audit-release/sbt_app_streaming/src/main/scala/StreamingApp.scala
index a1d8971abe..58a662bd9b 100644
--- a/dev/audit-release/sbt_app_streaming/src/main/scala/StreamingApp.scala
+++ b/dev/audit-release/sbt_app_streaming/src/main/scala/StreamingApp.scala
@@ -32,7 +32,6 @@ object SparkStreamingExample {
case None => new SparkConf().setAppName("Simple Streaming App")
}
val ssc = new StreamingContext(conf, Seconds(1))
- SparkContext.jarOfClass(this.getClass).foreach(ssc.sparkContext.addJar)
val seen = ListBuffer[RDD[Int]]()
val rdd1 = ssc.sparkContext.makeRDD(1 to 100, 10)
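For context, the surrounding streaming example reduces to roughly the following after the deleted line. This is a sketch reconstructed from the hunk above plus standard Spark Streaming APIs, not a verbatim copy of StreamingApp.scala; in particular, how the optional master is obtained is an assumption here.

// Illustrative sketch only: imports and surrounding structure are
// reconstructed, not copied verbatim from StreamingApp.scala.
import scala.collection.mutable.ListBuffer

import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SparkStreamingExample {
  def main(args: Array[String]): Unit = {
    // Assumed source of the optional master URL; only the two `case` arms
    // below appear in the hunk shown above.
    val masterOpt: Option[String] = sys.env.get("SPARK_AUDIT_MASTER")
    val conf = masterOpt match {
      case Some(master) => new SparkConf().setAppName("Simple Streaming App").setMaster(master)
      case None => new SparkConf().setAppName("Simple Streaming App")
    }
    val ssc = new StreamingContext(conf, Seconds(1))
    // The removed line registered the application jar by hand; with
    // spark-submit in Spark 1.0 that step is not needed (inferred rationale,
    // not stated in the commit message).
    val seen = ListBuffer[RDD[Int]]()
    val rdd1 = ssc.sparkContext.makeRDD(1 to 100, 10)
    // ... the original example goes on to exercise rdd1 and collect results
    // into `seen`; that part is omitted from this sketch.
  }
}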