author | Shivansh <shiv4nsh@gmail.com> | 2016-08-07 09:30:18 +0100
---|---|---
committer | Sean Owen <sowen@cloudera.com> | 2016-08-07 09:30:18 +0100
commit | 6c1ecb191bc086290e33d56b6a5706d962e84a3a (patch) |
tree | a6989ff64c58860f9c99fa2f00bae096691cff8c /docs/programming-guide.md |
parent | 1275f646964d2fdb5b96a9429760b4fac4340521 (diff) |
[SPARK-16911] Fix the links in the programming guide
## What changes were proposed in this pull request?
Fix the broken links in the programming guide that point to the GraphX migration guide and to the "Understanding closures" section.
## How was this patch tested?
By running the test cases and checking the links.
Author: Shivansh <shiv4nsh@gmail.com>
Closes #14503 from shiv4nsh/SPARK-16911.
Diffstat (limited to 'docs/programming-guide.md')
-rw-r--r-- | docs/programming-guide.md | 45
1 file changed, 1 insertion(+), 44 deletions(-)
```diff
diff --git a/docs/programming-guide.md b/docs/programming-guide.md
index 5fcd4d3647..f82832905e 100644
--- a/docs/programming-guide.md
+++ b/docs/programming-guide.md
@@ -1097,7 +1097,7 @@ for details.
 <tr>
   <td> <b>foreach</b>(<i>func</i>) </td>
   <td> Run a function <i>func</i> on each element of the dataset. This is usually done for side effects such as updating an <a href="#accumulators">Accumulator</a> or interacting with external storage systems.
-  <br /><b>Note</b>: modifying variables other than Accumulators outside of the <code>foreach()</code> may result in undefined behavior. See <a href="#ClosuresLink">Understanding closures </a> for more details.</td>
+  <br /><b>Note</b>: modifying variables other than Accumulators outside of the <code>foreach()</code> may result in undefined behavior. See <a href="#understanding-closures-a-nameclosureslinka">Understanding closures </a> for more details.</td>
 </tr>
 </table>
@@ -1544,49 +1544,6 @@ and then call `SparkContext.stop()` to tear it down.
 Make sure you stop the context within a `finally` block or the test framework's `tearDown` method,
 as Spark does not support two contexts running concurrently in the same program.
 
-# Migrating from pre-1.0 Versions of Spark
-
-<div class="codetabs">
-
-<div data-lang="scala" markdown="1">
-
-Spark 1.0 freezes the API of Spark Core for the 1.X series, in that any API available today that is
-not marked "experimental" or "developer API" will be supported in future versions.
-The only change for Scala users is that the grouping operations, e.g. `groupByKey`, `cogroup` and `join`,
-have changed from returning `(Key, Seq[Value])` pairs to `(Key, Iterable[Value])`.
-
-</div>
-
-<div data-lang="java" markdown="1">
-
-Spark 1.0 freezes the API of Spark Core for the 1.X series, in that any API available today that is
-not marked "experimental" or "developer API" will be supported in future versions.
-Several changes were made to the Java API:
-
-* The Function classes in `org.apache.spark.api.java.function` became interfaces in 1.0, meaning that old
-  code that `extends Function` should `implement Function` instead.
-* New variants of the `map` transformations, like `mapToPair` and `mapToDouble`, were added to create RDDs
-  of special data types.
-* Grouping operations like `groupByKey`, `cogroup` and `join` have changed from returning
-  `(Key, List<Value>)` pairs to `(Key, Iterable<Value>)`.
-
-</div>
-
-<div data-lang="python" markdown="1">
-
-Spark 1.0 freezes the API of Spark Core for the 1.X series, in that any API available today that is
-not marked "experimental" or "developer API" will be supported in future versions.
-The only change for Python users is that the grouping operations, e.g. `groupByKey`, `cogroup` and `join`,
-have changed from returning (key, list of values) pairs to (key, iterable of values).
-
-</div>
-
-</div>
-
-Migration guides are also available for [Spark Streaming](streaming-programming-guide.html#migration-guide-from-091-or-below-to-1x),
-[MLlib](ml-guide.html#migration-guide) and [GraphX](graphx-programming-guide.html#migrating-from-spark-091).
-
-
 # Where to Go from Here
 
 You can see some [example Spark programs](http://spark.apache.org/examples.html) on the Spark website.
```
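The odd-looking anchor this patch introduces, `#understanding-closures-a-nameclosureslinka`, is the ID that kramdown auto-generates from the heading text `Understanding closures <a name="ClosuresLink"></a>` in the guide's markdown source: non-alphanumeric characters are dropped and the remaining words are lowercased and joined with hyphens. A rough Python sketch of that rule (an approximation for illustration, not kramdown's actual implementation):

```python
import re

def kramdown_auto_id(heading: str) -> str:
    """Approximate kramdown's automatic header-ID generation (a sketch)."""
    # 1. Drop every character except letters, digits, spaces, hyphens, underscores.
    text = re.sub(r"[^a-zA-Z0-9 _-]", "", heading)
    # 2. Lowercase and collapse runs of spaces into single hyphens.
    text = re.sub(r" +", "-", text.strip().lower())
    # 3. kramdown also strips leading characters up to the first letter.
    return re.sub(r"^[^a-z]+", "", text)

print(kramdown_auto_id('Understanding closures <a name="ClosuresLink"></a>'))
# → understanding-closures-a-nameclosureslinka
```

This is why the hand-written `#ClosuresLink` fragment was broken: the inline `<a name=...>` tag does not survive as an anchor in the rendered page, but its text leaks into the generated heading ID, which is what the fixed link now targets.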