author     Matei Zaharia <matei@databricks.com>      2014-05-30 00:34:33 -0700
committer  Patrick Wendell <pwendell@gmail.com>      2014-05-30 00:34:53 -0700
commit     80721fb451abff8fafbffb4a6a9c97183502f1e2 (patch)
tree       c1fef6375d48114475bfd28e4ae533ab5d02296e /docs/configuration.md
parent     0f56aadc8d1c34463cee2e234b6250145d866cd7 (diff)
[SPARK-1566] consolidate programming guide, and general doc updates
This is a fairly large PR to clean up and update the docs for 1.0. The major changes are:
* A unified programming guide for all languages replaces language-specific ones and shows language-specific info in tabs
* New programming guide sections on key-value pairs, unit testing, input formats beyond text, migrating from 0.9, and passing functions to Spark
* Spark-submit guide moved to a separate page and expanded slightly
* Various cleanups of the menu system, security docs, and others
* Updated look of title bar to differentiate the docs from previous Spark versions
You can find the updated docs at http://people.apache.org/~matei/1.0-docs/_site/ and in particular http://people.apache.org/~matei/1.0-docs/_site/programming-guide.html.
Author: Matei Zaharia <matei@databricks.com>
Closes #896 from mateiz/1.0-docs and squashes the following commits:
03e6853 [Matei Zaharia] Some tweaks to configuration and YARN docs
0779508 [Matei Zaharia] tweak
ef671d4 [Matei Zaharia] Keep frames in JavaDoc links, and other small tweaks
1bf4112 [Matei Zaharia] Review comments
4414f88 [Matei Zaharia] tweaks
d04e979 [Matei Zaharia] Fix some old links to Java guide
a34ed33 [Matei Zaharia] tweak
541bb3b [Matei Zaharia] miscellaneous changes
fcefdec [Matei Zaharia] Moved submitting apps to separate doc
61d72b4 [Matei Zaharia] stuff
181f217 [Matei Zaharia] migration guide, remove old language guides
e11a0da [Matei Zaharia] Add more API functions
6a030a9 [Matei Zaharia] tweaks
8db0ae3 [Matei Zaharia] Added key-value pairs section
318d2c9 [Matei Zaharia] tweaks
1c81477 [Matei Zaharia] New section on basics and function syntax
e38f559 [Matei Zaharia] Actually added programming guide to Git
a33d6fe [Matei Zaharia] First pass at updating programming guide to support all languages, plus other tweaks throughout
3b6a876 [Matei Zaharia] More CSS tweaks
01ec8bf [Matei Zaharia] More CSS tweaks
e6d252e [Matei Zaharia] Change color of doc title bar to differentiate from 0.9.0
(cherry picked from commit c8bf4131bc2a2e147e977159fc90e94b85738830)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
Diffstat (limited to 'docs/configuration.md')

-rw-r--r--  docs/configuration.md  11

1 file changed, 6 insertions(+), 5 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index b6e7fd34ea..2fd691800e 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -7,8 +7,8 @@ title: Spark Configuration
 
 Spark provides three locations to configure the system:
 
-* [Spark properties](#spark-properties) control most application parameters and can be set by passing
-  a [SparkConf](api/core/index.html#org.apache.spark.SparkConf) object to SparkContext, or through Java
+* [Spark properties](#spark-properties) control most application parameters and can be set by using
+  a [SparkConf](api/core/index.html#org.apache.spark.SparkConf) object, or through Java
   system properties.
 * [Environment variables](#environment-variables) can be used to set per-machine settings, such as
   the IP address, through the `conf/spark-env.sh` script on each node.
@@ -18,8 +18,8 @@ Spark provides three locations to configure the system:
 
 Spark properties control most application settings and are configured separately for each
 application. These properties can be set directly on a
-[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) and passed as an argument to your
-SparkContext. SparkConf allows you to configure some of the common properties
+[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) passed to your
+`SparkContext`. `SparkConf` allows you to configure some of the common properties
 (e.g. master URL and application name), as well as arbitrary key-value pairs through the
 `set()` method. For example, we could initialize an application as follows:
@@ -75,6 +75,7 @@ appear. For all other configuration properties, you can assume the default value
 Most of the properties that control internal settings have reasonable default values.
 Some of the most common options to set are:
 
+#### Application Properties
 <table class="table">
 <tr><th>Property Name</th><th>Default</th><th>Meaning</th></tr>
 <tr>
@@ -777,7 +778,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 </table>
 
-#### Cluster Managers (YARN, Mesos, Standalone)
+#### Cluster Managers
 
 Each cluster manager in Spark has additional configuration options. Configurations can be found
 on the pages for each mode:
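For context, the `SparkConf` initialization that the second hunk's "as follows:" line leads into takes this general shape. This is a minimal sketch, assuming a Spark 1.x Scala application; the application name and memory value are illustrative, not taken from this diff:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Build a SparkConf with common properties (master URL, app name),
// plus an arbitrary key-value pair via set(), then pass it to the
// SparkContext -- the pattern the revised docs describe.
val conf = new SparkConf()
  .setMaster("local")                  // run locally, single thread
  .setAppName("ExampleApp")            // illustrative application name
  .set("spark.executor.memory", "1g")  // arbitrary property via set()
val sc = new SparkContext(conf)
```

Configured this way, properties set on the `SparkConf` take effect for that application only, which is why the docs call them out as per-application settings.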