path: root/tools/src
Commit message | Author | Age | Files | Lines
* Merge pull request #557 from ScrapCodes/style. Closes #557. | Patrick Wendell | 2014-02-09 | 1 | -2/+4

    SPARK-1058, Fix Style Errors and Add Scala Style to Spark Build.

    Author: Patrick Wendell <pwendell@gmail.com>
    Author: Prashant Sharma <scrapcodes@gmail.com>

    == Merge branch commits ==

    commit 1a8bd1c059b842cb95cc246aaea74a79fec684f4
    Author: Prashant Sharma <scrapcodes@gmail.com>
    Date:   Sun Feb 9 17:39:07 2014 +0530

        scala style fixes

    commit f91709887a8e0b608c5c2b282db19b8a44d53a43
    Author: Patrick Wendell <pwendell@gmail.com>
    Date:   Fri Jan 24 11:22:53 2014 -0800

        Adding scalastyle snapshot
* Fixed import formatting. | Tathagata Das | 2014-01-12 | 1 | -1/+1
* Moved DStream, DStreamCheckpointData and PairDStream from org.apache.spark.streaming to org.apache.spark.streaming.dstream. | Tathagata Das | 2014-01-12 | 1 | -25/+25
* Merge branch 'master' into scala-2.10 | Raymond Liu | 2013-11-13 | 1 | -2/+2
* Move some classes to more appropriate packages: | Matei Zaharia | 2013-09-01 | 1 | -19/+19

    * RDD, *RDDFunctions -> org.apache.spark.rdd
    * Utils, ClosureCleaner, SizeEstimator -> org.apache.spark.util
    * JavaSerializer, KryoSerializer -> org.apache.spark.serializer
* Initial work to rename package to org.apache.spark | Matei Zaharia | 2013-09-01 | 1 | -92/+92
* Made PairRDDFunctions taking only Tuple2, but made the rest of the shuffle code path working with general Product2. | Reynold Xin | 2013-08-19 | 1 | -1/+1
* Allow subclasses of Product2 in all key-value related classes (ShuffleDependency, PairRDDFunctions, etc). | Reynold Xin | 2013-08-18 | 1 | -2/+4
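
    The two Product2 entries above describe loosening the key-value code path
    from literal Tuple2 pairs to any Product2. A minimal sketch of what that
    generalization allows, using hypothetical helper and class names (this is
    not the actual Spark shuffle code):

    object Product2Sketch {
      // Bounding the element type by Product2 accepts any key-value carrier,
      // not just literal Tuple2 pairs; Tuple2 still works because it is
      // itself a Product2.
      def keys[K, V](pairs: Seq[Product2[K, V]]): Seq[K] = pairs.map(_._1)

      // A hypothetical key-value class that is not a Tuple2 but is a Product2.
      class Edge[K, V](val src: K, val attr: V) extends Product2[K, V] {
        def _1: K = src
        def _2: V = attr
        def canEqual(that: Any): Boolean = that.isInstanceOf[Edge[_, _]]
      }

      def main(args: Array[String]): Unit = {
        println(keys(Seq((1, "a"), (2, "b"))))                  // List(1, 2)
        println(keys(Seq(new Edge(1, "a"), new Edge(2, "b"))))  // List(1, 2)
      }
    }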
* Change scala.Option to Guava Optional in Java APIs. | Josh Rosen | 2013-08-11 | 1 | -15/+19
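
    The Option-to-Optional entry reflects converting scala.Option values to
    Guava's Optional at the Java API boundary. A minimal sketch of that
    conversion pattern, assuming Guava is on the classpath and using an
    illustrative helper name:

    import com.google.common.base.Optional

    object OptionConversionSketch {
      // Illustrative helper: bridge a scala.Option into a Guava Optional so
      // that Java callers never see Scala's Option type.
      def toGuavaOptional[T](option: Option[T]): Optional[T] = option match {
        case Some(value) => Optional.of(value)
        case None        => Optional.absent()
      }

      def main(args: Array[String]): Unit = {
        println(toGuavaOptional(Some(42)))  // Optional.of(42)
        println(toGuavaOptional(None))      // Optional.absent()
      }
    }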
* Remove annotation code that broke build. | Josh Rosen | 2013-07-22 | 1 | -5/+0
* Add JavaAPICompletenessChecker. | Josh Rosen | 2013-07-22 | 1 | -0/+359

    This is used to find methods in the Scala API that need to be ported to the Java API. To use it:

        ./run spark.tools.JavaAPICompletenessChecker

    Conflicts:
        project/SparkBuild.scala
        run
        run2.cmd
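
    The checker described above finds Scala API methods that have no Java API
    counterpart. A rough reflection-based sketch of the idea, with an
    illustrative object name and name-only diffing (the real tool is far more
    thorough and also accounts for differing signatures):

    object ApiDiffSketch {
      // List public method names present on one class but missing from
      // another; this only diffs names, nothing more.
      def missingMethods(scalaApi: Class[_], javaApi: Class[_]): Set[String] = {
        def names(c: Class[_]): Set[String] = c.getMethods.map(_.getName).toSet
        names(scalaApi) -- names(javaApi)
      }

      def main(args: Array[String]): Unit = {
        // Pass the Scala class and its Java wrapper as arguments,
        // e.g. ApiDiffSketch spark.RDD spark.api.java.JavaRDD
        val Array(scalaName, javaName) = args
        missingMethods(Class.forName(scalaName), Class.forName(javaName))
          .toSeq.sorted.foreach(println)
      }
    }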