author    w00228970 <wangfei1@huawei.com>    2014-10-12 23:35:50 -0700
committer Josh Rosen <joshrosen@apache.org>  2014-10-12 23:35:50 -0700
commit    92e017fb894be1e8e2b2b5274fec4c31a7a4412e (patch)
tree      e89078fca01287d31c6ad8be3fa719b16285ae61 /docs
parent    d8b8c210786dfb905d06ea0a21d633f7772d5d1a (diff)
[SPARK-3899][Doc] Fix wrong links in streaming doc
There are three [Custom Receiver Guide] links in the streaming doc; the first one is wrong.

Author: w00228970 <wangfei1@huawei.com>
Author: wangfei <wangfei1@huawei.com>

Closes #2749 from scwf/streaming-doc and squashes the following commits:

0cd76b7 [wangfei] update link to jump to the Akka-specific section
45b0646 [w00228970] wrong link in streaming doc
Diffstat (limited to 'docs')
-rw-r--r--  docs/streaming-programming-guide.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/streaming-programming-guide.md b/docs/streaming-programming-guide.md
index 5c21e912ea..738309c668 100644
--- a/docs/streaming-programming-guide.md
+++ b/docs/streaming-programming-guide.md
@@ -494,7 +494,7 @@ methods for creating DStreams from files and Akka actors as input sources.
For simple text files, there is an easier method `streamingContext.textFileStream(dataDirectory)`. And file streams do not require running a receiver, hence does not require allocating cores.
-- **Streams based on Custom Actors:** DStreams can be created with data streams received through Akka actors by using `streamingContext.actorStream(actorProps, actor-name)`. See the [Custom Receiver Guide](#implementing-and-using-a-custom-actor-based-receiver) for more details.
+- **Streams based on Custom Actors:** DStreams can be created with data streams received through Akka actors by using `streamingContext.actorStream(actorProps, actor-name)`. See the [Custom Receiver Guide](streaming-custom-receivers.html#implementing-and-using-a-custom-actor-based-receiver) for more details.
- **Queue of RDDs as a Stream:** For testing a Spark Streaming application with test data, one can also create a DStream based on a queue of RDDs, using `streamingContext.queueStream(queueOfRDDs)`. Each RDD pushed into the queue will be treated as a batch of data in the DStream, and processed like a stream.
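For reference, a minimal sketch of the queue-of-RDDs testing pattern that the changed section's final paragraph describes, using Spark Streaming's Scala API. The object name, batch interval, and sample data are illustrative and not part of the patch:

    import scala.collection.mutable

    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object QueueStreamSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setMaster("local[2]").setAppName("QueueStreamSketch")
        val ssc = new StreamingContext(conf, Seconds(1))

        // Each RDD pushed into this queue is treated as one batch of the DStream.
        val rddQueue = new mutable.Queue[RDD[Int]]()
        val inputStream = ssc.queueStream(rddQueue)
        inputStream.map(_ * 2).print()

        ssc.start()
        // Feed a few illustrative test batches, roughly one per batch interval.
        for (_ <- 1 to 3) {
          rddQueue.synchronized { rddQueue += ssc.sparkContext.makeRDD(1 to 10) }
          Thread.sleep(1000)
        }
        ssc.stop()
      }
    }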