Diffstat (limited to 'docs/building-spark.md')
-rw-r--r--  docs/building-spark.md | 24 ++++++++++++++----------
1 file changed, 14 insertions(+), 10 deletions(-)
diff --git a/docs/building-spark.md b/docs/building-spark.md
index 2c987cf834..6908fc1ba7 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -50,7 +50,7 @@ To create a Spark distribution like those distributed by the
 to be runnable, use `./dev/make-distribution.sh` in the project root directory. It can be configured
 with Maven profile settings and so on like the direct Maven build. Example:
 
-    ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
+    ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pmesos -Pyarn
 
 For more information on usage, run `./dev/make-distribution.sh --help`
 
@@ -105,13 +105,17 @@ By default Spark will build with Hive 1.2.1 bindings.
 
 ## Packaging without Hadoop Dependencies for YARN
 
-The assembly directory produced by `mvn package` will, by default, include all of Spark's
-dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this
-causes multiple versions of these to appear on executor classpaths: the version packaged in
+The assembly directory produced by `mvn package` will, by default, include all of Spark's
+dependencies, including Hadoop and some of its ecosystem projects. On YARN deployments, this
+causes multiple versions of these to appear on executor classpaths: the version packaged in
 the Spark assembly and the version on each node, included with `yarn.application.classpath`.
-The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects,
+The `hadoop-provided` profile builds the assembly without including Hadoop-ecosystem projects,
 like ZooKeeper and Hadoop itself.
 
+## Building with Mesos support
+
+    ./build/mvn -Pmesos -DskipTests clean package
+
 ## Building for Scala 2.10
 
 To produce a Spark package compiled with Scala 2.10, use the `-Dscala-2.10` property:
@@ -263,17 +267,17 @@ The run-tests script also can be limited to a specific Python version or a speci
 
 ## Running R Tests
 
-To run the SparkR tests you will need to install the R package `testthat`
-(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
+To run the SparkR tests you will need to install the R package `testthat`
+(run `install.packages(testthat)` from R shell). You can run just the SparkR tests using
 the command:
 
     ./R/run-tests.sh
 
 ## Running Docker-based Integration Test Suites
 
-In order to run Docker integration tests, you have to install the `docker` engine on your box.
-The instructions for installation can be found at [the Docker site](https://docs.docker.com/engine/installation/).
-Once installed, the `docker` service needs to be started, if not already running.
+In order to run Docker integration tests, you have to install the `docker` engine on your box.
+The instructions for installation can be found at [the Docker site](https://docs.docker.com/engine/installation/).
+Once installed, the `docker` service needs to be started, if not already running.
 On Linux, this can be done by `sudo service docker start`.
 
     ./build/mvn install -DskipTests
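
For reference, the new `-Pmesos` profile composes with the existing build paths like any other Maven profile flag; both commands below are taken verbatim from the updated docs:

    # Compile Spark with the Mesos scheduler backend, skipping tests
    ./build/mvn -Pmesos -DskipTests clean package

    # Or bake Mesos support into a runnable binary distribution
    ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pmesos -Pyarn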
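
Profiles combine, so the `hadoop-provided` packaging described in the same file can be paired with the new profile. A minimal sketch, assuming a cluster that supplies its own Hadoop jars on the executor classpath:

    # Sketch: Mesos-enabled assembly without bundled Hadoop-ecosystem jars
    ./build/mvn -Pmesos -Phadoop-provided -DskipTests clean package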
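
One pre-existing nit carried through the SparkR hunk: in R, `install.packages` takes a quoted string, so `install.packages(testthat)` fails unless a variable named `testthat` happens to be in scope. A non-interactive equivalent (the CRAN mirror URL is illustrative):

    # Install testthat from the shell; quotes around the package name are required
    Rscript -e 'install.packages("testthat", repos = "https://cloud.r-project.org")'

    # Then run just the SparkR tests
    ./R/run-tests.sh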