| author | Shixiong Zhu <shixiong@databricks.com> | 2016-08-25 21:08:42 -0700 |
|---|---|---|
| committer | Reynold Xin <rxin@databricks.com> | 2016-08-25 21:08:42 -0700 |
| commit | 341e0e778dff8c404b47d34ee7661b658bb91880 | |
| tree | b5c796effe9dc99f20d9ee0970e532fb3431b172 | |
| parent | b964a172a8c075486189cc9be09a51b8446f0da4 | |
[SPARK-17242][DOCUMENT] Update links of external dstream projects
## What changes were proposed in this pull request?

Updated the links to external DStream projects.

## How was this patch tested?

Documentation changes only.
Author: Shixiong Zhu <shixiong@databricks.com>
Closes #14814 from zsxwing/dstream-link.
| -rw-r--r-- | docs/streaming-programming-guide.md | 8 |
|---|---|---|

1 file changed, 2 insertions(+), 6 deletions(-)
```diff
diff --git a/docs/streaming-programming-guide.md b/docs/streaming-programming-guide.md
index df94e9533e..82d36474ff 100644
--- a/docs/streaming-programming-guide.md
+++ b/docs/streaming-programming-guide.md
@@ -656,7 +656,7 @@ methods for creating DStreams from files as input sources.
 <span class="badge" style="background-color: grey">Python API</span> `fileStream` is not available in the Python API, only `textFileStream` is available.
 
 - **Streams based on Custom Receivers:** DStreams can be created with data streams received through custom receivers. See the [Custom Receiver
-  Guide](streaming-custom-receivers.html) and [DStream Akka](https://github.com/spark-packages/dstream-akka) for more details.
+  Guide](streaming-custom-receivers.html) for more details.
 
 - **Queue of RDDs as a Stream:** For testing a Spark Streaming application with test data, one can also create a DStream based on a queue of RDDs, using `streamingContext.queueStream(queueOfRDDs)`. Each RDD pushed into the queue will be treated as a batch of data in the DStream, and processed like a stream.
 
@@ -2383,11 +2383,7 @@ additional effort may be necessary to achieve exactly-once semantics. There are
 - [Kafka Integration Guide](streaming-kafka-integration.html)
 - [Kinesis Integration Guide](streaming-kinesis-integration.html)
 - [Custom Receiver Guide](streaming-custom-receivers.html)
-* External DStream data sources:
-  - [DStream MQTT](https://github.com/spark-packages/dstream-mqtt)
-  - [DStream Twitter](https://github.com/spark-packages/dstream-twitter)
-  - [DStream Akka](https://github.com/spark-packages/dstream-akka)
-  - [DStream ZeroMQ](https://github.com/spark-packages/dstream-zeromq)
+* Third-party DStream data sources can be found in [Spark Packages](https://spark-packages.org/)
 * API documentation
   - Scala docs
     * [StreamingContext](api/scala/index.html#org.apache.spark.streaming.StreamingContext) and
```
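The guide text touched by this diff mentions testing a streaming application with a queue of RDDs via `streamingContext.queueStream(queueOfRDDs)`. That pattern can be sketched as below; this is a minimal, illustrative sketch assuming Spark Streaming is on the classpath, and the application name, master, batch interval, and batch contents are all arbitrary choices for the example:

```scala
import scala.collection.mutable.Queue

import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{Seconds, StreamingContext}

object QueueStreamExample {
  def main(args: Array[String]): Unit = {
    // Local StreamingContext with a 1-second batch interval (values are arbitrary).
    val conf = new SparkConf().setMaster("local[2]").setAppName("QueueStreamExample")
    val ssc = new StreamingContext(conf, Seconds(1))

    // Each RDD pushed into this queue is treated as one batch of the DStream.
    val rddQueue = new Queue[RDD[Int]]()
    val inputStream = ssc.queueStream(rddQueue)

    // An arbitrary computation over each batch, just to have output to observe.
    inputStream.map(x => (x % 10, 1)).reduceByKey(_ + _).print()

    ssc.start()

    // Push a few test batches, then let the streaming job drain them.
    for (_ <- 1 to 3) {
      rddQueue.synchronized {
        rddQueue += ssc.sparkContext.makeRDD(1 to 100, 2)
      }
      Thread.sleep(1000)
    }
    ssc.stop()
  }
}
```

By default `queueStream` consumes one RDD from the queue per batch interval; passing `oneAtATime = false` makes it consume all queued RDDs in a single batch.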