path: root/examples
Commit message (Author, Date; Files changed, Lines -/+)
* Style fixes and addressed review comments at #221 (Prashant Sharma, 2013-12-10; 1 file, -10/+10)
* Incorporated Patrick's feedback comment on #211 and made Maven build/dep-resolution at least a bit faster. (Prashant Sharma, 2013-12-07; 1 file, -1/+1)
* Merge branch 'master' of github.com:apache/incubator-spark into scala-2.10-temp (Prashant Sharma, 2013-11-21; 6 files, -16/+19)
    Conflicts:
      core/src/main/scala/org/apache/spark/util/collection/PrimitiveVector.scala
      streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaStreamingContext.scala
| * Merge branch 'master' into removesemicolonscala (Henry Saputra, 2013-11-19; 2 files, -11/+14)
| | * Enable the Broadcast examples to work in a cluster setting (Aaron Davidson, 2013-11-18; 2 files, -11/+14)
    Since they rely on println to display results, we need to first collect those results to the driver to have them actually display locally.
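A minimal Scala sketch of the collect-before-println pattern that commit describes; the object name and data below are illustrative, not the commit's actual code:

```scala
import org.apache.spark.SparkContext

object CollectThenPrintSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "CollectThenPrintSketch")
    val results = sc.parallelize(1 to 10).map(_ * 2)

    // results.foreach(println) would run on the executors, so in a cluster the
    // output goes to the executors' stdout and never appears on the driver.

    // Collecting first brings the results back to the driver, where println
    // displays them locally.
    results.collect().foreach(println)

    sc.stop()
  }
}
```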
| * | Remove the semicolons at the end of Scala code to make it more pure Scala code. (Henry Saputra, 2013-11-19; 4 files, -5/+5)
    Also remove unused imports as I found them along the way. Remove return statements when returning value in the Scala code. Passing compile and tests.
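For illustration only (not the actual diff), this is the kind of cleanup that commit describes: dropping trailing semicolons and explicit return statements in favor of plain expressions.

```scala
object StyleCleanupSketch {
  // Before: trailing semicolon and an explicit return statement.
  def doubledBefore(x: Int): Int = {
    return x * 2;
  }

  // After: idiomatic Scala drops both; the last expression is the result.
  def doubledAfter(x: Int): Int = x * 2
}
```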
* | Merge branch 'scala210-master' of github.com:colorant/incubator-spark into scala-2.10 (Prashant Sharma, 2013-11-21; 9 files, -30/+274)
    Conflicts:
      core/src/main/scala/org/apache/spark/deploy/client/Client.scala
      core/src/main/scala/org/apache/spark/deploy/worker/Worker.scala
      core/src/main/scala/org/apache/spark/executor/CoarseGrainedExecutorBackend.scala
      core/src/test/scala/org/apache/spark/MapOutputTrackerSuite.scala
| * | Merge branch 'master' into scala-2.10 (Raymond Liu, 2013-11-13; 9 files, -30/+274)
| | * fix sparkhdfs lr test (tgravescs, 2013-10-29; 1 file, -1/+2)
| | * Makes Spark SIMR ready. (Ali Ghodsi, 2013-10-24; 1 file, -1/+1)
| | * Merge pull request #64 from prabeesh/master (Matei Zaharia, 2013-10-23; 1 file, -0/+107)
    MQTT Adapter for Spark Streaming

    MQTT is a machine-to-machine (M2M) / Internet of Things connectivity protocol. It was designed as an extremely lightweight publish/subscribe messaging transport. You can read more about it at http://mqtt.org/.

    Message Queue Telemetry Transport (MQTT) is an open message protocol for M2M communications. It enables the transfer of telemetry-style data in the form of messages from devices like sensors and actuators to mobile phones, embedded systems on vehicles, or laptops and full-scale computers. The protocol was invented by Andy Stanford-Clark of IBM and Arlen Nipper of Cirrus Link Solutions. It enables a publish/subscribe messaging model in an extremely lightweight way and is useful for connections with remote locations where code footprint and network bandwidth are constrained.

    MQTT is one of the most widely used protocols for the Internet of Things, and it is getting a lot of attention as more and more devices get connected to the internet and produce data; researchers and companies predict some 25 billion devices will be connected to the internet by 2015. Plugins/support for MQTT are available in popular message queues such as RabbitMQ and ActiveMQ. Support for MQTT in Spark will help people with Internet of Things (IoT) projects use Spark Streaming for their real-time data processing needs (from sensors and other embedded devices).
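A rough sketch of what an MQTT word count over Spark Streaming looks like. The MQTTUtils.createStream helper, the broker URL, and the topic below are assumptions for illustration; the code actually added by this pull request lives in MQTTWordCount.scala.

```scala
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.StreamingContext._   // pair DStream operations on older Spark versions
import org.apache.spark.streaming.mqtt.MQTTUtils        // assumed helper from the MQTT adapter

object MQTTWordCountSketch {
  def main(args: Array[String]) {
    val ssc = new StreamingContext("local[2]", "MQTTWordCountSketch", Seconds(2))

    // Subscribe to a topic on an MQTT broker (URL and topic are placeholders).
    val lines = MQTTUtils.createStream(ssc, "tcp://localhost:1883", "sensor/readings")

    // Classic streaming word count over each batch.
    val wordCounts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
    wordCounts.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```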
| | | * Update MQTTWordCount.scala (Prabeesh K, 2013-10-22; 1 file, -6/+1)
| | | * Update MQTTWordCount.scala (Prabeesh K, 2013-10-22; 1 file, -3/+4)
| | | * Update MQTTWordCount.scala (Prabeesh K, 2013-10-18; 1 file, -15/+14)
| | | * remove unused dependency (prabeesh, 2013-10-17; 1 file, -5/+0)
| | | * add maven dependencies for mqtt (prabeesh, 2013-10-16; 1 file, -0/+5)
| | | * added mqtt adapter wordcount example (prabeesh, 2013-10-16; 1 file, -0/+112)
| | * | Merge pull request #56 from jerryshao/kafka-0.8-dev (Matei Zaharia, 2013-10-21; 3 files, -19/+135)
    Upgrade Kafka 0.7.2 to Kafka 0.8.0-beta1 for Spark Streaming
    Conflicts:
      streaming/pom.xml
| | | * | Upgrade Kafka 0.7.2 to Kafka 0.8.0-beta1 for Spark Streaming (jerryshao, 2013-10-12; 3 files, -19/+135)
| | * | | Exclusion rules for Maven build files. (Reynold Xin, 2013-10-19; 1 file, -0/+8)
| | * | | BroadcastTest2 --> BroadcastTest (Mosharaf Chowdhury, 2013-10-16; 2 files, -62/+12)
| | * | | Default blockSize is 4MB. (Mosharaf Chowdhury, 2013-10-16; 1 file, -0/+59)
    BroadcastTest2 example added for testing broadcasts.
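A simplified sketch of a broadcast test in the spirit of BroadcastTest2 (the names and sizes are illustrative, not the commit's code): a large array is broadcast once and each task reads it through the broadcast handle instead of capturing it in the closure.

```scala
import org.apache.spark.SparkContext

object BroadcastSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "BroadcastSketch")

    // Something large enough that shipping it with every task would hurt.
    val bigArray = (1 to 1000000).toArray
    val broadcastArr = sc.broadcast(bigArray)

    // Each task reads the broadcast value via .value; the data is distributed
    // to each executor once (in blocks) rather than once per task.
    val lengths = sc.parallelize(1 to 10, 10).map(_ => broadcastArr.value.length)
    println(lengths.collect().mkString(", "))

    sc.stop()
  }
}
```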
| | * | Fixing spark streaming example and a bug in examples build. (Patrick Wendell, 2013-10-15; 1 file, -4/+9)
    - Examples assembly included a log4j.properties which clobbered Spark's
    - Example had an error where some classes weren't serializable
    - Did some other clean-up in this example
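On the "classes weren't serializable" point, a hedged illustration (not the commit's actual fix): helper objects captured by Spark closures must be serializable, otherwise the job fails with a NotSerializableException when tasks are shipped to executors.

```scala
import org.apache.spark.SparkContext

// Without 'extends Serializable', capturing an instance of this class in a map()
// closure would fail at runtime with a NotSerializableException.
class Scale(val factor: Int) extends Serializable {
  def apply(x: Int): Int = x * factor
}

object SerializableHelperSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "SerializableHelperSketch")
    val scale = new Scale(3)
    // 'scale' is captured by the closure and serialized along with the task.
    println(sc.parallelize(1 to 5).map(x => scale(x)).collect().mkString(", "))
    sc.stop()
  }
}
```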
| | * | Remove unnecessary mutable imports (Neal Wiggins, 2013-10-11; 1 file, -2/+0)
* | | | Remove deprecated actorFor and use actorSelection everywhere. (Prashant Sharma, 2013-11-12; 1 file, -1/+1)
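A small sketch of the Akka API change that commit refers to (the actor name and message here are made up): the actorFor lookup, deprecated in Akka 2.2, is replaced with actorSelection.

```scala
import akka.actor.{Actor, ActorSystem, Props}

class Echo extends Actor {
  def receive = { case msg => println(msg) }
}

object ActorSelectionSketch {
  def main(args: Array[String]) {
    val system = ActorSystem("sketch")
    system.actorOf(Props[Echo], "echo")

    // Deprecated in Akka 2.2:  val ref = system.actorFor("/user/echo")
    // Replacement: look the actor up via a selection instead.
    val selection = system.actorSelection("/user/echo")
    selection ! "hello"

    system.shutdown()
  }
}
```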
* | | | Merge branch 'scala-2.10' of github.com:ScrapCodes/spark into scala-2.10 (Prashant Sharma, 2013-10-10; 1 file, -7/+21)
    Conflicts:
      core/src/main/scala/org/apache/spark/scheduler/cluster/ClusterTaskSetManager.scala
      project/SparkBuild.scala
| * | | Merge branch 'master' into wip-merge-master (Prashant Sharma, 2013-10-08; 1 file, -6/+20)
    Conflicts:
      bagel/pom.xml
      core/pom.xml
      core/src/test/scala/org/apache/spark/ui/UISuite.scala
      examples/pom.xml
      mllib/pom.xml
      pom.xml
      project/SparkBuild.scala
      repl/pom.xml
      streaming/pom.xml
      tools/pom.xml
    In Scala 2.10 a shorter representation is used for naming artifacts, so the artifacts were switched to the shorter Scala version string and it was made a property in the POM.
| | * | Merging build changes in from 0.8 (Patrick Wendell, 2013-10-05; 1 file, -8/+22)
| * | | Merge branch 'master' into scala-2.10 (Prashant Sharma, 2013-10-01; 1 file, -1/+1)
    Conflicts:
      core/src/main/scala/org/apache/spark/ui/jobs/JobProgressUI.scala
      docs/_config.yml
      project/SparkBuild.scala
      repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
| | * | Update build version in master (Patrick Wendell, 2013-09-24; 1 file, -1/+1)
* | | | fixed some warnings (Martin Weindel, 2013-10-05; 4 files, -5/+3)
* | | Sync with master and some build fixes (Prashant Sharma, 2013-09-26; 1 file, -1/+1)
* | | fixed maven build for scala 2.10 (Prashant Sharma, 2013-09-26; 1 file, -5/+5)
* | | Akka 2.2 migration (Prashant Sharma, 2013-09-22; 2 files, -4/+6)
* | | Merge branch 'master' of git://github.com/mesos/spark into scala-2.10 (Prashant Sharma, 2013-09-15; 1 file, -14/+0)
    Conflicts:
      core/src/main/scala/org/apache/spark/SparkContext.scala
      project/SparkBuild.scala
| * | Minor YARN build cleanups (Jey Kottalam, 2013-09-06; 1 file, -14/+0)
* | Merged with master (Prashant Sharma, 2013-09-06; 59 files, -494/+2086)
| * Add missing license headers found with RAT (Matei Zaharia, 2013-09-02; 1 file, -0/+17)
| * Move some classes to more appropriate packages: (Matei Zaharia, 2013-09-01; 5 files, -13/+11)
    * RDD, *RDDFunctions -> org.apache.spark.rdd
    * Utils, ClosureCleaner, SizeEstimator -> org.apache.spark.util
    * JavaSerializer, KryoSerializer -> org.apache.spark.serializer
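After the package moves listed above, code imports these classes from their new homes, e.g. org.apache.spark.rdd.RDD (and similarly KryoSerializer under org.apache.spark.serializer). A small illustrative snippet, with hypothetical helper names:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD   // moved to org.apache.spark.rdd from the top-level org.apache.spark package

object PackageMoveSketch {
  // A helper typed against the relocated RDD class.
  def sum(nums: RDD[Int]): Long = nums.map(_.toLong).reduce(_ + _)

  def main(args: Array[String]) {
    val sc = new SparkContext("local[2]", "PackageMoveSketch")
    println(sum(sc.parallelize(1 to 100)))
    sc.stop()
  }
}
```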
| * Fix some URLs (Matei Zaharia, 2013-09-01; 1 file, -1/+1)
| * Initial work to rename package to org.apache.spark (Matei Zaharia, 2013-09-01; 53 files, -220/+220)
| * Update Maven build to create assemblies expected by new scripts (Matei Zaharia, 2013-08-29; 1 file, -6/+56)
    This includes the following changes:
    - The "assembly" package now builds in Maven by default, and creates an assembly containing both hadoop-client and Spark, unlike the old BigTop distribution assembly that skipped hadoop-client
    - There is now a bigtop-dist package to build the old BigTop assembly
    - The repl-bin package is no longer built by default since the scripts don't rely on it; instead it can be enabled with -Prepl-bin
    - Py4J is now included in the assembly/lib folder as a local Maven repo, so that the Maven package can link to it
    - run-example now adds the original Spark classpath as well, because the Maven examples assembly lists spark-core and such as provided
    - The various Maven projects add a spark-yarn dependency correctly
| * Fix finding of assembly JAR, as well as some pointers to ./run (Matei Zaharia, 2013-08-29; 8 files, -13/+13)
| * Change build and run instructions to use assemblies (Matei Zaharia, 2013-08-29; 4 files, -0/+452)
    This commit makes Spark invocation saner by using an assembly JAR to find all of Spark's dependencies instead of adding all the JARs in lib_managed. It also packages the examples into an assembly and uses that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script with two better-named scripts: "run-examples" for examples, and "spark-class" for Spark internal classes (e.g. REPL, master, etc). This is also designed to minimize the confusion people have in trying to use "run" to run their own classes; it's not meant to do that, but now at least if they look at it, they can modify run-examples to do a decent job for them.

    As part of this, Bagel's examples are also now properly moved to the examples package instead of bagel.
| * Remove redundant dependencies from POMs (Jey Kottalam, 2013-08-18; 1 file, -4/+0)
| * Updates to repl and example POMs to match SBT build (Jey Kottalam, 2013-08-16; 1 file, -0/+10)
| * Maven build now also works with YARN (Jey Kottalam, 2013-08-16; 1 file, -57/+0)
| * Don't mark hadoop-client as 'provided' (Jey Kottalam, 2013-08-16; 1 file, -1/+0)
| * Maven build now works with CDH hadoop-2.0.0-mr1 (Jey Kottalam, 2013-08-16; 1 file, -44/+0)
| * Initial changes to make Maven build agnostic of hadoop version (Jey Kottalam, 2013-08-16; 1 file, -84/+60)