path: root/docs/building-spark.md
author    Michael Gummelt <mgummelt@mesosphere.io>    2016-08-26 12:25:22 -0700
committer Marcelo Vanzin <vanzin@cloudera.com>    2016-08-26 12:25:22 -0700
commit    8e5475be3c9a620f18f6712631b093464a7d0ee7 (patch)
tree      417e25ea8798c0f9313285623a664fe7ac4fc003 /docs/building-spark.md
parent    c0949dc944b7e2fc8a4465acc68a8f2713b3fa13 (diff)
[SPARK-16967] move mesos to module
## What changes were proposed in this pull request?

Move Mesos code into a mvn module.

## How was this patch tested?

- unit tests
- manually submitting a client mode and cluster mode job
- spark/mesos integration test suite

Author: Michael Gummelt <mgummelt@mesosphere.io>

Closes #14637 from mgummelt/mesos-module.
Diffstat (limited to 'docs/building-spark.md')
-rw-r--r--  docs/building-spark.md | 24
1 file changed, 14 insertions(+), 10 deletions(-)
diff --git a/docs/building-spark.md b/docs/building-spark.md
index 2c987cf834..6908fc1ba7 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -50,7 +50,7 @@ To create a Spark distribution like those distributed by the
to be runnable, use `./dev/make-distribution.sh` in the project root directory. It can be configured
with Maven profile settings and so on like the direct Maven build. Example:
- ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
+ ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pmesos -Pyarn
For more information on usage, run `./dev/make-distribution.sh --help`
@@ -105,13 +105,17 @@ By default Spark will build with Hive 1.2.1 bindings.
## Packaging without Hadoop Dependencies for YARN
-The assembly directory produced by `mvn package` will, by default, include all of Spark's
-dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this
-causes multiple versions of these to appear on executor classpaths: the version packaged in
+The assembly directory produced by `mvn package` will, by default, include all of Spark's
+dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this
+causes multiple versions of these to appear on executor classpaths: the version packaged in
the Spark assembly and the version on each node, included with `yarn.application.classpath`.
-The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects,
+The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects,
like ZooKeeper and Hadoop itself.
+## Building with Mesos support
+
+ ./build/mvn -Pmesos -DskipTests clean package
+
## Building for Scala 2.10
To produce a Spark package compiled with Scala 2.10, use the `-Dscala-2.10` property:
@@ -263,17 +267,17 @@ The run-tests script also can be limited to a specific Python version or a speci
## Running R Tests
-To run the SparkR tests you will need to install the R package `testthat`
-(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
+To run the SparkR tests you will need to install the R package `testthat`
+(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
the command:
./R/run-tests.sh
## Running Docker-based Integration Test Suites
-In order to run Docker integration tests, you have to install the `docker` engine on your box.
-The instructions for installation can be found at [the Docker site](https://docs.docker.com/engine/installation/).
-Once installed, the `docker` service needs to be started, if not already running.
+In order to run Docker integration tests, you have to install the `docker` engine on your box.
+The instructions for installation can be found at [the Docker site](https://docs.docker.com/engine/installation/).
+Once installed, the `docker` service needs to be started, if not already running.
On Linux, this can be done by `sudo service docker start`.
./build/mvn install -DskipTests
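
The net effect of this patch on a build workflow is that Mesos support now requires opting in via the `-Pmesos` profile. A minimal sketch composing the distribution command from the profiles shown in the first hunk (the profile list is illustrative, not exhaustive, and `PROFILES`/`CMD` are hypothetical helper variables, not part of the Spark build scripts):

```shell
#!/bin/sh
# Compose the make-distribution invocation from the patched docs.
# The profile list mirrors the "+" line of the first hunk; -Pmesos is
# the flag this commit adds. Run from the Spark project root.
PROFILES="-Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pmesos -Pyarn"
CMD="./dev/make-distribution.sh --name custom-spark --tgz $PROFILES"
# Print the command rather than running it, so the sketch stays side-effect free.
echo "$CMD"
```

Without `-Pmesos`, the resulting distribution would omit the new Mesos module entirely; the same flag applies to a plain test build (`./build/mvn -Pmesos -DskipTests clean package`, as the second hunk adds).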