author     Tathagata Das <tathagata.das1565@gmail.com>  2014-01-14 00:03:46 -0800
committer  Tathagata Das <tathagata.das1565@gmail.com>  2014-01-14 00:03:46 -0800
commit     f8bd828c7ccf1ff69bc35bf95d07183cb35a7c72 (patch)
tree       6b4e4307be6b8959d5c7b3785a5b4b380d435edb /docs
parent     f8e239e058953e8db88e784439cfd9eca446e606 (diff)
Fixed loose ends in docs.
Diffstat (limited to 'docs')
-rw-r--r--  docs/streaming-programming-guide.md | 4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/streaming-programming-guide.md b/docs/streaming-programming-guide.md
index 1495af2267..07c4c55633 100644
--- a/docs/streaming-programming-guide.md
+++ b/docs/streaming-programming-guide.md
@@ -48,10 +48,10 @@ ssc.textFileStream(directory) // Creates a stream that monitors and processes
ssc.socketStream(hostname, port) // Creates a stream that uses a TCP socket to read data from hostname:port
{% endhighlight %}
-The core Spark Streaming API provides input streams for files, sockets, Akka actors. Additional functionality for Kafka, Flume, ZeroMQ, Twitter, etc. can be imported by adding the right dependencies as explained in the [linking](#linking-with-spark-streaming) section.
+The core Spark Streaming API provides input streams for files, sockets, and Akka actors. Additional functionality for Kafka, Flume, ZeroMQ, Twitter, etc. can be imported by adding the right dependencies as explained in the [linking](#linking-with-spark-streaming) section.
# DStream Operations
-Data received from the input streams can be processed using _DStream operations_. There are two kinds of operations - _transformations_ and _output operations_. Similar to RDD transformations, DStream transformations operate on one or more DStreams to create new DStreams with transformed data. After applying a sequence of transformations to the input streams, output operations need to called, which writes data out to an external data sink like a file system or a database.
+Data received from the input streams can be processed using _DStream operations_. There are two kinds of operations - _transformations_ and _output operations_. Similar to RDD transformations, DStream transformations operate on one or more DStreams to create new DStreams with transformed data. After applying a sequence of transformations to the input streams, output operations need to be called, which write data out to an external data sink like a file system or a database.
## Transformations
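
The transformation-then-output pattern described in the changed paragraph can be sketched in plain Scala, with no Spark dependency. This is only an illustration of the shape of the API, not Spark code: each inner `Seq` stands in for one micro-batch of a DStream, and the names `batches`, `words`, and `counts` are hypothetical.

```scala
// Hypothetical micro-batches: each inner Seq stands in for one batch
// of a DStream's data.
val batches = Seq(Seq("a b", "c"), Seq("a a"))

// "Transformations" derive new streams from existing ones, in the
// spirit of DStream.flatMap / map: split each batch into words, then
// count occurrences of each word within the batch.
val words  = batches.map(_.flatMap(_.split(" ")))
val counts = words.map(_.groupBy(identity).map { case (w, ws) => (w, ws.size) })

// An "output operation" finally writes the data to an external sink;
// here we just print each batch's counts in sorted order.
counts.foreach(batch => println(batch.toSeq.sorted))
```

In Spark itself the analogous output operations would be `print()` or `saveAsTextFiles()`, and no computation runs until an output operation is registered; this sketch eagerly evaluates instead.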