author     Shixiong Zhu <shixiong@databricks.com>    2016-03-14 23:21:30 -0700
committer  Reynold Xin <rxin@databricks.com>         2016-03-14 23:21:30 -0700
commit     43304b1758dec141b7fe9ed33cac976d75efdf91
tree       c05690bea980854517baa6b0bb7495f87d844d9d /docs/streaming-custom-receivers.md
parent     e64958001cb95d53c441131f8c7a92556f49fd7d
[SPARK-13888][DOC] Remove Akka Receiver doc and refer to the DStream Akka project
## What changes were proposed in this pull request?
I have copied the Streaming Akka docs to https://github.com/spark-packages/dstream-akka/blob/master/README.md, so they can now be removed from Spark.
## How was this patch tested?
Documentation-only changes.
Author: Shixiong Zhu <shixiong@databricks.com>
Closes #11711 from zsxwing/remove-akka-doc.
Diffstat (limited to 'docs/streaming-custom-receivers.md')
-rw-r--r-- | docs/streaming-custom-receivers.md | 61 |
1 file changed, 0 insertions(+), 61 deletions(-)
diff --git a/docs/streaming-custom-receivers.md b/docs/streaming-custom-receivers.md
index 732c83dc84..a4e17fd24e 100644
--- a/docs/streaming-custom-receivers.md
+++ b/docs/streaming-custom-receivers.md
@@ -256,64 +256,3 @@ The following table summarizes the characteristics of both types of receivers
   <td></td>
 </tr>
 </table>
-
-## Implementing and Using a Custom Actor-based Receiver
-
-Custom [Akka Actors](http://doc.akka.io/docs/akka/2.3.11/scala/actors.html) can also be used to
-receive data. Here are the instructions.
-
-1. **Linking:** You need to add the following dependency to your SBT or Maven project (see [Linking section](streaming-programming-guide.html#linking) in the main programming guide for further information).
-
-        groupId = org.apache.spark
-        artifactId = spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}
-        version = {{site.SPARK_VERSION_SHORT}}
-
-2. **Programming:**
-
-    <div class="codetabs">
-    <div data-lang="scala" markdown="1" >
-
-    You need to extend [`ActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.ActorReceiver)
-    so as to store received data into Spark using `store(...)` methods. The supervisor strategy of
-    this actor can be configured to handle failures, etc.
-
-        class CustomActor extends ActorReceiver {
-          def receive = {
-            case data: String => store(data)
-          }
-        }
-
-        // A new input stream can be created with this custom actor as
-        val ssc: StreamingContext = ...
-        val lines = AkkaUtils.createStream[String](ssc, Props[CustomActor](), "CustomReceiver")
-
-    See [ActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala) for an end-to-end example.
-    </div>
-    <div data-lang="java" markdown="1">
-
-    You need to extend [`JavaActorReceiver`](api/scala/index.html#org.apache.spark.streaming.akka.JavaActorReceiver)
-    so as to store received data into Spark using `store(...)` methods. The supervisor strategy of
-    this actor can be configured to handle failures, etc.
-
-        class CustomActor extends JavaActorReceiver {
-          @Override
-          public void onReceive(Object msg) throws Exception {
-            store((String) msg);
-          }
-        }
-
-        // A new input stream can be created with this custom actor as
-        JavaStreamingContext jssc = ...;
-        JavaDStream<String> lines = AkkaUtils.<String>createStream(jssc, Props.create(CustomActor.class), "CustomReceiver");
-
-    See [JavaActorWordCount.scala](https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/streaming/JavaActorWordCount.scala) for an end-to-end example.
-    </div>
-    </div>
-
-3. **Deploying:** As with any Spark applications, `spark-submit` is used to launch your application.
-You need to package `spark-streaming-akka_{{site.SCALA_BINARY_VERSION}}` and its dependencies into
-the application JAR. Make sure `spark-core_{{site.SCALA_BINARY_VERSION}}` and `spark-streaming_{{site.SCALA_BINARY_VERSION}}`
-are marked as `provided` dependencies as those are already present in a Spark installation. Then
-use `spark-submit` to launch your application (see [Deploying section](streaming-programming-guide.html#deploying-applications) in the main programming guide).
-
-<span class="badge" style="background-color: grey">Python API</span> Since actors are available only in the Java and Scala libraries, AkkaUtils is not available in the Python API.
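For context, the linking and deploying steps described in the removed docs amount to an SBT build fragment like the sketch below. This is illustrative only: the removed docs used the `{{site.SPARK_VERSION_SHORT}}` Jekyll placeholder rather than a concrete version, so `x.y.z` here is a stand-in you would replace with your Spark release.

```scala
// Illustrative SBT sketch of the linking/deploying guidance from the removed section.
// "x.y.z" is a placeholder version, not a value from the removed docs.
libraryDependencies ++= Seq(
  // The Akka receiver helper lived outside spark-core/spark-streaming,
  // so it must be packaged into the application JAR.
  "org.apache.spark" %% "spark-streaming-akka" % "x.y.z",
  // Per the removed docs, core and streaming are marked "provided"
  // because a Spark installation already ships them.
  "org.apache.spark" %% "spark-core"      % "x.y.z" % "provided",
  "org.apache.spark" %% "spark-streaming" % "x.y.z" % "provided"
)
```

With this in place, the application is launched via `spark-submit` as with any Spark application; the equivalent instructions now live in the dstream-akka README linked in the commit message.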