author    Matei Zaharia <matei@eecs.berkeley.edu>    2013-08-31 14:21:10 -0700
committer Matei Zaharia <matei@eecs.berkeley.edu>    2013-08-31 14:21:10 -0700
commit    4819baa658a6c8a3e4c5c504af284ea6091e4c35 (patch)
tree      00eda629ac7292487ef14f858d19297c38a19607 /docs/running-on-mesos.md
parent    4293533032bd5c354bb011f8d508b99615c6e0f0 (diff)
More updates, describing changes to recommended use of environment vars
and new Python stuff
Diffstat (limited to 'docs/running-on-mesos.md')
-rw-r--r--  docs/running-on-mesos.md  |  3
1 file changed, 1 insertion(+), 2 deletions(-)
diff --git a/docs/running-on-mesos.md b/docs/running-on-mesos.md
index f4a3eb667c..b31f78e8bf 100644
--- a/docs/running-on-mesos.md
+++ b/docs/running-on-mesos.md
@@ -9,9 +9,8 @@ Spark can run on private clusters managed by the [Apache Mesos](http://incubator
2. Download Mesos {{site.MESOS_VERSION}} from a [mirror](http://www.apache.org/dyn/closer.cgi/incubator/mesos/mesos-{{site.MESOS_VERSION}}/).
3. Configure Mesos using the `configure` script, passing the location of your `JAVA_HOME` using `--with-java-home`. Mesos comes with "template" configure scripts for different platforms, such as `configure.macosx`, that you can run. See the README file in Mesos for other options. **Note:** If you want to run Mesos without installing it into the default paths on your system (e.g. if you don't have administrative privileges to install it), you should also pass the `--prefix` option to `configure` to tell it where to install. For example, pass `--prefix=/home/user/mesos`. By default the prefix is `/usr/local`.
4. Build Mesos using `make`, and then install it using `make install`.
-5. Create a file called `spark-env.sh` in Spark's `conf` directory, by copying `conf/spark-env.sh.template`, and add the following lines in it:
+5. Create a file called `spark-env.sh` in Spark's `conf` directory, by copying `conf/spark-env.sh.template`, and add the following lines to it:
* `export MESOS_NATIVE_LIBRARY=<path to libmesos.so>`. This path is usually `<prefix>/lib/libmesos.so` (where the prefix is `/usr/local` by default). Also, on Mac OS X, the library is called `libmesos.dylib` instead of `.so`.
- * `export SCALA_HOME=<path to Scala directory>`.
6. Copy Spark and Mesos to the _same_ paths on all the nodes in the cluster (or, for Mesos, `make install` on every node).
7. Configure Mesos for deployment:
* On your master node, edit `<prefix>/var/mesos/deploy/masters` to list your master and `<prefix>/var/mesos/deploy/slaves` to list the slaves, where `<prefix>` is the prefix where you installed Mesos (`/usr/local` by default).
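
For reference, a minimal shell sketch of steps 3 and 4 from the patched document (configuring and building Mesos). The `--prefix` value is the non-default example given in the doc; drop it to install to `/usr/local`, and point `JAVA_HOME` at your own JDK:

```bash
# Steps 3-4: configure and build Mesos from its unpacked source directory.
# --prefix=/home/user/mesos is the doc's example of a non-default install location.
./configure --with-java-home="$JAVA_HOME" --prefix=/home/user/mesos
make
make install
```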
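A sketch of the `conf/spark-env.sh` that step 5 describes after this change (the `SCALA_HOME` export is the line being removed), assuming Mesos was installed with the default `/usr/local` prefix:

```bash
# conf/spark-env.sh, created by copying conf/spark-env.sh.template.
# Path assumes the default Mesos prefix /usr/local.
export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so
# On Mac OS X the library is libmesos.dylib instead of libmesos.so:
# export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.dylib
```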
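Likewise, a sketch of the step 7 deployment files on the master node; the hostnames are placeholders, and the paths again assume the default `/usr/local` prefix:

```bash
# Step 7: list the master and slave hosts (placeholder hostnames; use your own machines).
echo "mesos-master.example.com" > /usr/local/var/mesos/deploy/masters
cat > /usr/local/var/mesos/deploy/slaves <<EOF
mesos-slave1.example.com
mesos-slave2.example.com
EOF
```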