author:    Matei Zaharia <matei@databricks.com>  2014-05-30 00:34:33 -0700
committer: Patrick Wendell <pwendell@gmail.com>  2014-05-30 00:34:33 -0700
commit:    c8bf4131bc2a2e147e977159fc90e94b85738830
tree:      a2f885df8fb6654bd7750bb344b97a6cb6889bf3 /docs/running-on-mesos.md
parent:    eeee978a348ec2a35cc27865cea6357f9db75b74
[SPARK-1566] consolidate programming guide, and general doc updates
This is a fairly large PR to clean up and update the docs for 1.0. The major changes are:
* A unified programming guide for all languages replaces language-specific ones and shows language-specific info in tabs
* New programming guide sections on key-value pairs, unit testing, input formats beyond text, migrating from 0.9, and passing functions to Spark
* Spark-submit guide moved to a separate page and expanded slightly
* Various cleanups of the menu system, security docs, and others
* Updated look of title bar to differentiate the docs from previous Spark versions
You can find the updated docs at http://people.apache.org/~matei/1.0-docs/_site/ and in particular http://people.apache.org/~matei/1.0-docs/_site/programming-guide.html.
Author: Matei Zaharia <matei@databricks.com>
Closes #896 from mateiz/1.0-docs and squashes the following commits:
03e6853 [Matei Zaharia] Some tweaks to configuration and YARN docs
0779508 [Matei Zaharia] tweak
ef671d4 [Matei Zaharia] Keep frames in JavaDoc links, and other small tweaks
1bf4112 [Matei Zaharia] Review comments
4414f88 [Matei Zaharia] tweaks
d04e979 [Matei Zaharia] Fix some old links to Java guide
a34ed33 [Matei Zaharia] tweak
541bb3b [Matei Zaharia] miscellaneous changes
fcefdec [Matei Zaharia] Moved submitting apps to separate doc
61d72b4 [Matei Zaharia] stuff
181f217 [Matei Zaharia] migration guide, remove old language guides
e11a0da [Matei Zaharia] Add more API functions
6a030a9 [Matei Zaharia] tweaks
8db0ae3 [Matei Zaharia] Added key-value pairs section
318d2c9 [Matei Zaharia] tweaks
1c81477 [Matei Zaharia] New section on basics and function syntax
e38f559 [Matei Zaharia] Actually added programming guide to Git
a33d6fe [Matei Zaharia] First pass at updating programming guide to support all languages, plus other tweaks throughout
3b6a876 [Matei Zaharia] More CSS tweaks
01ec8bf [Matei Zaharia] More CSS tweaks
e6d252e [Matei Zaharia] Change color of doc title bar to differentiate from 0.9.0
Diffstat (limited to 'docs/running-on-mesos.md')
 docs/running-on-mesos.md | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)
diff --git a/docs/running-on-mesos.md b/docs/running-on-mesos.md
index df8687f81f..e3c8922404 100644
--- a/docs/running-on-mesos.md
+++ b/docs/running-on-mesos.md
@@ -103,7 +103,7 @@ the `make-distribution.sh` script included in a Spark source tarball/checkout.
 ## Using a Mesos Master URL
 
 The Master URLs for Mesos are in the form `mesos://host:5050` for a single-master Mesos
-cluster, or `zk://host:2181` for a multi-master Mesos cluster using ZooKeeper.
+cluster, or `mesos://zk://host:2181` for a multi-master Mesos cluster using ZooKeeper.
 
 The driver also needs some configuration in `spark-env.sh` to interact properly with Mesos:
@@ -116,7 +116,7 @@ The driver also needs some configuration in `spark-env.sh` to interact properly
 2. Also set `spark.executor.uri` to `<URL of spark-{{site.SPARK_VERSION}}.tar.gz>`.
 
 Now when starting a Spark application against the cluster, pass a `mesos://`
-or `zk://` URL as the master when creating a `SparkContext`. For example:
+URL as the master when creating a `SparkContext`. For example:
 
 {% highlight scala %}
 val conf = new SparkConf()
@@ -126,6 +126,9 @@ val conf = new SparkConf()
 val sc = new SparkContext(conf)
 {% endhighlight %}
 
+(You can also use [`spark-submit`](submitting-applications.html) and configure `spark.executor.uri`
+in the [conf/spark-defaults.conf](configuration.html#loading-default-configurations) file.)
+
 When running a shell, the `spark.executor.uri` parameter is inherited from `SPARK_EXECUTOR_URI`,
 so it does not need to be redundantly passed in as a system property.
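The note this patch adds points users at `conf/spark-defaults.conf` as an alternative to setting options in code. A minimal sketch of what such a file might contain for the Mesos setup described above — the master URL and the executor URI placeholder are taken from the patched doc text; everything else is an assumption, not part of the patch:

```
# conf/spark-defaults.conf — hypothetical example, not from the patch.
# Multi-master Mesos cluster via ZooKeeper (URL form per the patched doc):
spark.master        mesos://zk://host:2181
# Executor tarball location; replace with a real URL:
spark.executor.uri  <URL of spark-{{site.SPARK_VERSION}}.tar.gz>
```

With these defaults in place, `spark-submit` would pick up the master and executor URI without `--master` or `--conf` flags on the command line.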