From f17510e371dfbeaada3c72b884d70c36503ea30a Mon Sep 17 00:00:00 2001
From: Andrew Or
Date: Fri, 27 Jun 2014 16:11:31 -0700
Subject: [SPARK-2259] Fix highly misleading docs on cluster / client deploy
 modes

The existing docs are highly misleading. For standalone mode, for example,
they encourage the user to use standalone-cluster mode, which is not
officially supported. Safeguards have been added in Spark submit itself to
prevent bad documentation from leading users down the wrong path in the
future.

This PR is prompted by countless headaches users of Spark have run into on
the mailing list.

Author: Andrew Or

Closes #1200 from andrewor14/submit-docs and squashes the following commits:

5ea2460 [Andrew Or] Rephrase cluster vs client explanation
c827f32 [Andrew Or] Clarify spark submit messages
9f7ed8f [Andrew Or] Clarify client vs cluster deploy mode + add safeguards
---
 docs/running-on-mesos.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/running-on-mesos.md b/docs/running-on-mesos.md
index e3c8922404..bd046cfc18 100644
--- a/docs/running-on-mesos.md
+++ b/docs/running-on-mesos.md
@@ -127,7 +127,8 @@ val sc = new SparkContext(conf)
 {% endhighlight %}
 
 (You can also use [`spark-submit`](submitting-applications.html) and configure `spark.executor.uri`
-in the [conf/spark-defaults.conf](configuration.html#loading-default-configurations) file.)
+in the [conf/spark-defaults.conf](configuration.html#loading-default-configurations) file. Note
+that `spark-submit` currently only supports deploying the Spark driver in `client` mode for Mesos.)
 
 When running a shell, the `spark.executor.uri` parameter is inherited from `SPARK_EXECUTOR_URI`,
 so it does not need to be redundantly passed in as a system property.
-- 
cgit v1.2.3
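
To illustrate the note this patch adds (Mesos submissions must use `client` deploy mode), a minimal sketch of the corresponding configuration and invocation follows. The Mesos master hostname, executor URI, and jar path are hypothetical placeholders, not values from the patch.

```shell
# Sketch only: hostnames, paths, and URIs below are hypothetical.
#
# Per this patch's note, spark.executor.uri can live in
# conf/spark-defaults.conf instead of being passed each time:
#
#   spark.executor.uri  hdfs://namenode/path/to/spark-1.0.0.tar.gz
#
# The driver must then be deployed in client mode for Mesos:

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master mesos://mesos-master.example.com:5050 \
  --deploy-mode client \
  lib/spark-examples.jar 100
```

With client mode, the driver runs in the process that invoked `spark-submit`, so its output appears on that machine's console rather than inside the cluster.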