path: root/docs/running-on-mesos.md
author     Andrew Or <andrewor14@gmail.com>      2014-06-27 16:11:31 -0700
committer  Patrick Wendell <pwendell@gmail.com>  2014-06-27 16:11:31 -0700
commit     f17510e371dfbeaada3c72b884d70c36503ea30a (patch)
tree       2a134954b34cdb3a1bf9b3e8dd7d251e9ccef28f  /docs/running-on-mesos.md
parent     21e0f77b6321590ed86223a60cdb8ae08ea4057f (diff)
[SPARK-2259] Fix highly misleading docs on cluster / client deploy modes
The existing docs are highly misleading. For standalone mode, for example, they encourage the user to use standalone-cluster mode, which is not officially supported. Safeguards have been added to Spark submit itself to prevent bad documentation from leading users down the wrong path in the future. This PR is prompted by the countless headaches Spark users have run into on the mailing list.

Author: Andrew Or <andrewor14@gmail.com>

Closes #1200 from andrewor14/submit-docs and squashes the following commits:

5ea2460 [Andrew Or] Rephrase cluster vs client explanation
c827f32 [Andrew Or] Clarify spark submit messages
9f7ed8f [Andrew Or] Clarify client vs cluster deploy mode + add safeguards
Diffstat (limited to 'docs/running-on-mesos.md')
-rw-r--r--  docs/running-on-mesos.md  3
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/docs/running-on-mesos.md b/docs/running-on-mesos.md
index e3c8922404..bd046cfc18 100644
--- a/docs/running-on-mesos.md
+++ b/docs/running-on-mesos.md
@@ -127,7 +127,8 @@ val sc = new SparkContext(conf)
{% endhighlight %}
(You can also use [`spark-submit`](submitting-applications.html) and configure `spark.executor.uri`
-in the [conf/spark-defaults.conf](configuration.html#loading-default-configurations) file.)
+in the [conf/spark-defaults.conf](configuration.html#loading-default-configurations) file. Note
+that `spark-submit` currently only supports deploying the Spark driver in `client` mode for Mesos.)
When running a shell, the `spark.executor.uri` parameter is inherited from `SPARK_EXECUTOR_URI`, so
it does not need to be redundantly passed in as a system property.
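As a concrete illustration of the configuration described in the added sentence, here is a minimal sketch of submitting an application to Mesos in client deploy mode; the Mesos master hostname, HDFS path, and application jar below are placeholder values, not taken from the commit:

```
# conf/spark-defaults.conf (sketch): point executors at a Spark distribution
# reachable by the Mesos agents; the HDFS path is a hypothetical example.
#   spark.executor.uri    hdfs://namenode:8020/dist/spark-1.0.0.tgz

# Submit against a Mesos master. Only client deploy mode is supported here,
# so the driver runs on the machine where spark-submit is invoked.
./bin/spark-submit \
  --master mesos://mesos-master.example.com:5050 \
  --deploy-mode client \
  --conf spark.executor.uri=hdfs://namenode:8020/dist/spark-1.0.0.tgz \
  --class org.apache.spark.examples.SparkPi \
  examples/target/spark-examples.jar 100
```

If `spark.executor.uri` is set in `conf/spark-defaults.conf` instead, the `--conf` flag can be dropped, since `spark-submit` loads that file by default.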