author    Sandy Ryza <sandy@cloudera.com>  2014-07-23 23:09:25 -0700
committer Patrick Wendell <pwendell@gmail.com>  2014-07-23 23:11:26 -0700
commit    e34922a221738bae1195d8ace90369c9ddc3a48d (patch)
tree      644023570fd785b835094002cad336aa2bfe0733 /docs/configuration.md
parent    78d18fdbaa62d8ed235c29b2e37fd6607263c639 (diff)
SPARK-2310. Support arbitrary Spark properties on the command line with spark-submit

The PR allows invocations like
spark-submit --class org.MyClass --spark.shuffle.spill false myjar.jar

Author: Sandy Ryza <sandy@cloudera.com>

Closes #1253 from sryza/sandy-spark-2310 and squashes the following commits:

1dc9855 [Sandy Ryza] More doc and cleanup
00edfb9 [Sandy Ryza] Review comments
91b244a [Sandy Ryza] Change format to --conf PROP=VALUE
8fabe77 [Sandy Ryza] SPARK-2310. Support arbitrary Spark properties on the command line with spark-submit
Diffstat (limited to 'docs/configuration.md')
-rw-r--r--  docs/configuration.md  8
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index 02af461267..cb0c65e2d2 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -42,13 +42,15 @@ val sc = new SparkContext(new SparkConf())
 
 Then, you can supply configuration values at runtime:
 {% highlight bash %}
-./bin/spark-submit --name "My fancy app" --master local[4] myApp.jar
+./bin/spark-submit --name "My app" --master local[4] --conf spark.shuffle.spill=false
+  --conf "spark.executor.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" myApp.jar
 {% endhighlight %}
 
 The Spark shell and [`spark-submit`](cluster-overview.html#launching-applications-with-spark-submit)
 tool support two ways to load configurations dynamically. The first are command line options,
-such as `--master`, as shown above. Running `./bin/spark-submit --help` will show the entire list
-of options.
+such as `--master`, as shown above. `spark-submit` can accept any Spark property using the `--conf`
+flag, but uses special flags for properties that play a part in launching the Spark application.
+Running `./bin/spark-submit --help` will show the entire list of these options.
 
 `bin/spark-submit` will also read configuration options from `conf/spark-defaults.conf`, in which
 each line consists of a key and a value separated by whitespace. For example:
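
The diff ends on that dangling "For example:", so the example itself is not part of this hunk. As an illustrative sketch only (the property values and host below are placeholders, not taken from this commit), a `conf/spark-defaults.conf` in the whitespace-separated format described above might look like:

```
# conf/spark-defaults.conf — one property per line, key and value separated by whitespace
# (placeholder values for illustration; not from the commit)
spark.master            spark://placeholder-host:7077
spark.executor.memory   2g
spark.shuffle.spill     false
```

Properties passed explicitly on the command line (including via the `--conf` flag this commit adds) take precedence over entries in `spark-defaults.conf`, which serve as defaults.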