path: root/docs/spark-standalone.md
author    Andrew Or <andrewor14@gmail.com>    2014-06-27 16:11:31 -0700
committer Patrick Wendell <pwendell@gmail.com>    2014-06-27 16:11:31 -0700
commit f17510e371dfbeaada3c72b884d70c36503ea30a (patch)
tree   2a134954b34cdb3a1bf9b3e8dd7d251e9ccef28f /docs/spark-standalone.md
parent 21e0f77b6321590ed86223a60cdb8ae08ea4057f (diff)
[SPARK-2259] Fix highly misleading docs on cluster / client deploy modes
The existing docs are highly misleading. For standalone mode, for example, they encourage the user to use standalone-cluster mode, which is not officially supported. Safeguards have been added in Spark submit itself to prevent bad documentation from leading users down the wrong path in the future. This PR is prompted by countless headaches users of Spark have run into on the mailing list.

Author: Andrew Or <andrewor14@gmail.com>

Closes #1200 from andrewor14/submit-docs and squashes the following commits:

5ea2460 [Andrew Or] Rephrase cluster vs client explanation
c827f32 [Andrew Or] Clarify spark submit messages
9f7ed8f [Andrew Or] Clarify client vs cluster deploy mode + add safeguards
Diffstat (limited to 'docs/spark-standalone.md')
-rw-r--r-- docs/spark-standalone.md | 9
1 file changed, 4 insertions(+), 5 deletions(-)
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index 3c1ce06083..f5c0f7cef8 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -235,11 +235,10 @@ You can also pass an option `--cores <numCores>` to control the number of cores
# Launching Compiled Spark Applications
-Spark supports two deploy modes: applications may run with the driver inside the client process or
-entirely inside the cluster. The
-[`spark-submit` script](submitting-applications.html) provides the
-most straightforward way to submit a compiled Spark application to the cluster in either deploy
-mode.
+The [`spark-submit` script](submitting-applications.html) provides the most straightforward way to
+submit a compiled Spark application to the cluster. For standalone clusters, Spark currently
+only supports deploying the driver inside the client process that is submitting the application
+(`client` deploy mode).
If your application is launched through Spark submit, then the application jar is automatically
distributed to all worker nodes. For any additional jars that your application depends on, you
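The revised docs above say that standalone clusters only support `client` deploy mode, where the driver runs inside the submitting process. A minimal `spark-submit` invocation for that mode might look like the sketch below; the master host, application class, and jar path are placeholder values, not anything specified in this commit.

```shell
# Submit a compiled application to a standalone cluster in client deploy mode.
# The driver runs inside this spark-submit process on the submitting machine.
# NOTE: master URL, class name, and jar path below are hypothetical examples.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode client \
  --class com.example.MyApp \
  path/to/my-app.jar
```

Because the application jar is distributed to the workers automatically by Spark submit, only additional dependency jars need to be handled separately.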