path: root/examples/src/main/scala
Commit message (Author, Date, Files, Lines)
* Merge branch 'master' into removesemicolonscala (Henry Saputra, 2013-11-19, 2 files, -11/+14)
|\
| * Enable the Broadcast examples to work in a cluster setting (Aaron Davidson, 2013-11-18, 2 files, -11/+14)
| |     Since they rely on println to display results, we need to first collect those results to the driver to have them actually display locally.
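A minimal sketch of the collect-then-print pattern described above, under the assumption of a simple local driver program (names here are illustrative, not the example's actual code): the map runs on the executors, collect() brings the results back to the driver, and only then does println produce output that is visible locally.

import org.apache.spark.SparkContext

object BroadcastPrintSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "Broadcast Test")
    val barr = sc.broadcast(Array.fill(1000000)(1))    // shared read-only data
    val sizes = sc.parallelize(1 to 10, 10)
      .map(_ => barr.value.length)                     // runs on the executors
      .collect()                                       // bring results back to the driver
    sizes.foreach(size => println("Broadcast value size: " + size))   // prints locally
    sc.stop()
  }
}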
* | Remove the semicolons at the end of Scala code to make it more pure Scala code. (Henry Saputra, 2013-11-19, 4 files, -5/+5)
|/      Also remove unused imports as I found them along the way. Remove return statements when returning value in the Scala code. Passing compile and tests.
* fix sparkhdfs lr test (tgravescs, 2013-10-29, 1 file, -1/+2)
|
* Makes Spark SIMR ready. (Ali Ghodsi, 2013-10-24, 1 file, -1/+1)
|
* Merge pull request #64 from prabeesh/master (Matei Zaharia, 2013-10-23, 1 file, -0/+107)
|\
| |     MQTT Adapter for Spark Streaming
| |
| |     MQTT is a machine-to-machine (M2M) / Internet of Things connectivity protocol. It was designed as an extremely lightweight publish/subscribe messaging transport. You may read more about it here: http://mqtt.org/
| |
| |     Message Queue Telemetry Transport (MQTT) is an open message protocol for M2M communications. It enables the transfer of telemetry-style data in the form of messages from devices like sensors and actuators to mobile phones, embedded systems on vehicles, or laptops and full-scale computers. The protocol was invented by Andy Stanford-Clark of IBM and Arlen Nipper of Cirrus Link Solutions.
| |
| |     This protocol enables a publish/subscribe messaging model in an extremely lightweight way. It is useful for connections with remote locations where code footprint and network bandwidth are constrained. MQTT is one of the most widely used protocols for the 'Internet of Things', and it is drawing attention as more and more devices get connected to the internet and produce data; researchers and companies predict some 25 billion devices will be connected to the internet by 2015. Plugins/support for MQTT are available in popular message queues such as RabbitMQ and ActiveMQ.
| |
| |     Support for MQTT in Spark will help people with Internet of Things (IoT) projects use Spark Streaming for their real-time data processing needs (from sensors and other embedded devices).
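For orientation, here is a rough sketch of how an MQTT stream is typically consumed from Spark Streaming. It assumes the MQTTUtils.createStream entry point of the external spark-streaming-mqtt module as exposed in later Spark releases, which may differ from the API added in this pull request; the broker URL and topic are placeholders.

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.streaming.mqtt.MQTTUtils

object MQTTWordCountSketch {
  def main(args: Array[String]) {
    val brokerUrl = "tcp://localhost:1883"   // placeholder MQTT broker
    val topic = "spark-demo"                 // placeholder topic
    val ssc = new StreamingContext("local[2]", "MQTTWordCount", Seconds(2))
    val lines = MQTTUtils.createStream(ssc, brokerUrl, topic)   // DStream of message payloads
    val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}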
| * Update MQTTWordCount.scala (Prabeesh K, 2013-10-22, 1 file, -6/+1)
| |
| * Update MQTTWordCount.scala (Prabeesh K, 2013-10-22, 1 file, -3/+4)
| |
| * Update MQTTWordCount.scala (Prabeesh K, 2013-10-18, 1 file, -15/+14)
| |
| * added mqtt adapter wordcount example (prabeesh, 2013-10-16, 1 file, -0/+112)
| |
* | Merge pull request #56 from jerryshao/kafka-0.8-dev (Matei Zaharia, 2013-10-21, 1 file, -13/+15)
|\ \
| | |   Upgrade Kafka 0.7.2 to Kafka 0.8.0-beta1 for Spark Streaming
| | |
| | |   Conflicts:
| | |       streaming/pom.xml
| * | Upgrade Kafka 0.7.2 to Kafka 0.8.0-beta1 for Spark Streaming (jerryshao, 2013-10-12, 1 file, -13/+15)
| | |
* | | BroadcastTest2 --> BroadcastTest (Mosharaf Chowdhury, 2013-10-16, 2 files, -62/+12)
| | |
* | | Default blockSize is 4MB. (Mosharaf Chowdhury, 2013-10-16, 1 file, -0/+59)
| | |   BroadcastTest2 example added for testing broadcasts.
* | | Fixing spark streaming example and a bug in examples build. (Patrick Wendell, 2013-10-15, 1 file, -4/+9)
|/ /
| |     - Examples assembly included a log4j.properties which clobbered Spark's
| |     - Example had an error where some classes weren't serializable
| |     - Did some other clean-up in this example
* / Remove unnecessary mutable imports (Neal Wiggins, 2013-10-11, 1 file, -2/+0)
|/
* Add missing license headers found with RAT (Matei Zaharia, 2013-09-02, 1 file, -0/+17)
|
* Move some classes to more appropriate packages: (Matei Zaharia, 2013-09-01, 5 files, -13/+11)
|       * RDD, *RDDFunctions -> org.apache.spark.rdd
|       * Utils, ClosureCleaner, SizeEstimator -> org.apache.spark.util
|       * JavaSerializer, KryoSerializer -> org.apache.spark.serializer
* Initial work to rename package to org.apache.spark (Matei Zaharia, 2013-09-01, 39 files, -129/+129)
|
* Fix finding of assembly JAR, as well as some pointers to ./run (Matei Zaharia, 2013-08-29, 8 files, -13/+13)
|
* Change build and run instructions to use assemblies (Matei Zaharia, 2013-08-29, 3 files, -0/+447)
|       This commit makes Spark invocation saner by using an assembly JAR to find all of Spark's dependencies instead of adding all the JARs in lib_managed. It also packages the examples into an assembly and uses that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script with two better-named scripts: "run-examples" for examples, and "spark-class" for Spark internal classes (e.g. REPL, master, etc). This is also designed to minimize the confusion people have in trying to use "run" to run their own classes; it's not meant to do that, but now at least if they look at it, they can modify run-examples to do a decent job for them.
|
|       As part of this, Bagel's examples are also now properly moved to the examples package instead of bagel.
* make SparkHadoopUtil a member of SparkEnv (Jey Kottalam, 2013-08-15, 1 file, -2/+1)
|
* Optimize Scala PageRank to use reduceByKey (Matei Zaharia, 2013-08-10, 1 file, -8/+4)
|
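A rough sketch of what the reduceByKey-based formulation looks like, with toy data and illustrative names rather than the example's actual code: each page divides its current rank among its out-links, and reduceByKey sums the contributions per target page in a single shuffle before the damping factor is applied.

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object PageRankSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "PageRankSketch")
    // Toy link structure: page -> pages it links to.
    val links = sc.parallelize(Seq(
      ("a", Seq("b", "c")), ("b", Seq("c")), ("c", Seq("a")))).cache()
    var ranks = links.mapValues(_ => 1.0)
    for (i <- 1 to 10) {
      // Each page divides its current rank among its out-links ...
      val contribs = links.join(ranks).values.flatMap { case (urls, rank) =>
        urls.map(url => (url, rank / urls.size))
      }
      // ... and reduceByKey sums the contributions for every target page.
      ranks = contribs.reduceByKey(_ + _).mapValues(0.15 + 0.85 * _)
    }
    ranks.collect().foreach { case (page, rank) => println(page + " has rank " + rank) }
    sc.stop()
  }
}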
* Style changes as per Matei's comments (Nick Pentreath, 2013-08-08, 1 file, -9/+8)
|
* Adding Scala version of PageRank example (Nick Pentreath, 2013-08-07, 1 file, -0/+51)
|
* Add Apache license headers and LICENSE and NOTICE files (Matei Zaharia, 2013-07-16, 35 files, -1/+596)
|
* Merge pull request #577 from skumargithub/master (Matei Zaharia, 2013-06-29, 1 file, -0/+50)
|\
| |     Example of cumulative counting using updateStateByKey
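A minimal sketch of the updateStateByKey pattern this pull request demonstrates, assuming a socket text source on localhost:9999 purely for illustration: the update function adds each batch's counts for a word to whatever running total is already stored for it.

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._

object CumulativeWordCountSketch {
  def main(args: Array[String]) {
    // Add this batch's counts for a word to its running total.
    val updateFunc = (values: Seq[Int], state: Option[Int]) =>
      Some(values.sum + state.getOrElse(0))

    val ssc = new StreamingContext("local[2]", "CumulativeWordCount", Seconds(1))
    ssc.checkpoint("checkpoint")   // updateStateByKey requires a checkpoint directory
    val words = ssc.socketTextStream("localhost", 9999).flatMap(_.split(" "))
    val totals = words.map((_, 1)).updateStateByKey[Int](updateFunc)
    totals.print()
    ssc.start()
    ssc.awaitTermination()
  }
}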
| * Removed unused code, clarified intent of the program, batch size to 1 second (unknown, 2013-05-06, 1 file, -5/+3)
| |
| * Modified as per TD's suggestions (unknown, 2013-04-30, 1 file, -17/+6)
| |
| * Example of cumulative counting using updateStateByKey (unknown, 2013-04-22, 1 file, -0/+63)
| |
* | Merge remote-tracking branch 'mrpotes/master' (Matei Zaharia, 2013-06-29, 3 files, -15/+12)
|\ \
| * | Fix usage and parameter extraction (James Phillpotts, 2013-06-25, 3 files, -12/+9)
| | |
| * | Include a default OAuth implementation, and update examples and JavaStreamingContext (James Phillpotts, 2013-06-25, 3 files, -3/+3)
| | |
* | | Merge branch 'master' into streaming (Tathagata Das, 2013-06-24, 31 files, -155/+441)
|\| |
| | |   Conflicts:
| | |       .gitignore
| * | Merge remote-tracking branch 'milliondreams/casdemo' (Matei Zaharia, 2013-06-18, 1 file, -0/+196)
| |\ \
| | | |     Conflicts:
| | | |         project/SparkBuild.scala
| | * | Fixing the style as per feedback (Rohit Rai, 2013-06-13, 1 file, -35/+37)
| | | |
| | * | Example to write the output to Cassandra (Rohit Rai, 2013-06-03, 1 file, -5/+43)
| | | |
| | * | A better way to read column value if you are sure the column exists in every row. (Rohit Rai, 2013-06-03, 1 file, -2/+4)
| | | |
| | * | Removing infix call (Rohit Rai, 2013-06-02, 1 file, -3/+3)
| | | |
| | * | Adding example to make Spark RDD from Cassandra (Rohit Rai, 2013-06-01, 1 file, -0/+154)
| | | |
| * | | Add HBase example (Ethan Jewett, 2013-05-09, 1 file, -0/+35)
| | | |
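A rough sketch of what an HBase-backed example of this kind typically looks like, reading a table through Hadoop's newAPIHadoopRDD and HBase's TableInputFormat; the table name is a placeholder and the HBase client jars are assumed to be on the classpath.

import org.apache.spark.SparkContext
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat

object HBaseReadSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "HBaseReadSketch")
    val conf = HBaseConfiguration.create()
    conf.set(TableInputFormat.INPUT_TABLE, "some_table")   // placeholder table name
    // Expose the table as an RDD of (row key, row result) pairs.
    val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[ImmutableBytesWritable], classOf[Result])
    println("Rows read: " + hBaseRDD.count())
    sc.stop()
  }
}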
| * | | Revert "Merge pull request #596 from esjewett/master" because theReynold Xin2013-05-091-35/+0
| | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | dependency on hbase introduces netty-3.2.2 which conflicts with netty-3.5.3 already in Spark. This caused multiple test failures. This reverts commit 0f1b7a06e1f6782711170234f105f1b277e3b04c, reversing changes made to aacca1b8a85bd073ce185a06d6470b070761b2f4.
| * | | Switch to using SparkContext method to create RDD (Ethan Jewett, 2013-05-07, 1 file, -2/+2)
| | | |
| * | | Fix indents and mention other configuration options (Ethan Jewett, 2013-05-04, 1 file, -2/+5)
| | | |
| * | | Remove unnecessary column family config (Ethan Jewett, 2013-05-04, 1 file, -4/+2)
| | | |
| * | | HBase example (Ethan Jewett, 2013-05-04, 1 file, -0/+34)
| |/ /
| * | Attempt at fixing merge conflict (Mridul Muralidharan, 2013-04-24, 5 files, -77/+77)
| |\|
| | * Uniform whitespace across Scala examples (Andrew Ash, 2013-04-09, 4 files, -76/+76)
| | |
| | * Corrected order of CountMinSketchMonoid arguments (Erik van oosten, 2013-04-02, 1 file, -1/+1)
| | |
| * | Fix review comments, add a new API to SparkHadoopUtil to create an appropriate Configuration. Modify an example to show how to use SplitInfo (Mridul Muralidharan, 2013-04-22, 1 file, -2/+8)
| |/