author     jay@apache.org <jayunit100>  2014-11-05 15:45:34 -0800
committer  Matei Zaharia <matei@databricks.com>  2014-11-05 15:45:34 -0800
commit     868cd4c3ca11e6ecc4425b972d9a20c360b52425 (patch)
tree       af724bbb6937eb6bd35785b41a794065ff71a0aa  /docs/configuration.md
parent     61a5cced049a8056292ba94f23fa7bd040f50685 (diff)
SPARK-4040. Update documentation to exemplify use of local (n) value, fo...
This is a minor docs update which helps to clarify the way local[n] is used for streaming apps.

Author: jay@apache.org <jayunit100>

Closes #2964 from jayunit100/SPARK-4040 and squashes the following commits:

35b5a5e [jay@apache.org] SPARK-4040: Update documentation to exemplify use of local (n) value.
Diffstat (limited to 'docs/configuration.md')
-rw-r--r--  docs/configuration.md  10
1 file changed, 8 insertions(+), 2 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index 685101ea5c..0f9eb81f6e 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -21,16 +21,22 @@ application. These properties can be set directly on a
[SparkConf](api/scala/index.html#org.apache.spark.SparkConf) passed to your
`SparkContext`. `SparkConf` allows you to configure some of the common properties
(e.g. master URL and application name), as well as arbitrary key-value pairs through the
-`set()` method. For example, we could initialize an application as follows:
+`set()` method. For example, we could initialize an application with two threads as follows:
+
+Note that we run with local[2], meaning two threads, which represents "minimal" parallelism
+and can help detect bugs that only exist when we run in a distributed context.
{% highlight scala %}
val conf = new SparkConf()
- .setMaster("local")
+ .setMaster("local[2]")
.setAppName("CountingSheep")
.set("spark.executor.memory", "1g")
val sc = new SparkContext(conf)
{% endhighlight %}
+Note that we can have more than one thread in local mode, and in cases like Spark Streaming,
+we may actually require more than one thread to prevent any sort of starvation issues.
+
## Dynamically Loading Spark Properties
In some cases, you may want to avoid hard-coding certain configurations in a `SparkConf`. For
instance, if you'd like to run the same application with different masters or different
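The starvation issue the patch mentions is easiest to see with a receiver-based streaming app: the receiver permanently occupies one thread, so with plain `local` there is no thread left to process the received batches. A minimal sketch of that scenario (hypothetical app name, host, and port; not part of this commit; assumes the `spark-streaming` artifact is on the classpath):

{% highlight scala %}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object NetworkWordCount {
  def main(args: Array[String]): Unit = {
    // local[2]: one thread runs the socket receiver, the other processes
    // batches. With plain "local", the receiver would occupy the only
    // thread and no batch would ever be processed (starvation).
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("NetworkWordCount") // hypothetical app name

    val ssc = new StreamingContext(conf, Seconds(1))

    // Hypothetical source: a text stream on localhost:9999.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.flatMap(_.split(" "))
         .map(word => (word, 1))
         .reduceByKey(_ + _)
         .print()

    ssc.start()
    ssc.awaitTermination()
  }
}
{% endhighlight %}

More generally, `local[n]` should use an `n` larger than the number of receivers the application runs, so at least one thread remains free for processing.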