author    Kay Ousterhout <kayousterhout@gmail.com>  2017-02-23 13:27:47 -0800
committer Kay Ousterhout <kayousterhout@gmail.com>  2017-02-23 13:27:47 -0800
commit    f87a6a59af5037c28d8b3c801c586b347f0ae10c (patch)
tree      525b6be651d6655e5dee33223d42c941f8fc0d10 /docs
parent    4fa4cf1d4ce51ce61e535cfad57385cb5c23b96d (diff)
[SPARK-19684][DOCS] Remove developer info from docs.
This commit moves developer-specific information from the release-
specific documentation in this repo to the developer tools page on
the main Spark website. This commit relies on this PR on the
Spark website: https://github.com/apache/spark-website/pull/33.
srowen
Author: Kay Ousterhout <kayousterhout@gmail.com>
Closes #17018 from kayousterhout/SPARK-19684.
Diffstat (limited to 'docs')
-rw-r--r--  docs/building-spark.md  43
1 file changed, 11 insertions, 32 deletions
diff --git a/docs/building-spark.md b/docs/building-spark.md
index 56b892696e..8353b7a520 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -132,20 +132,6 @@ Thus, the full flow for running continuous-compilation of the `core` submodule m
 
     $ cd core
     $ ../build/mvn scala:cc
 
-## Speeding up Compilation with Zinc
-
-[Zinc](https://github.com/typesafehub/zinc) is a long-running server version of SBT's incremental
-compiler. When run locally as a background process, it speeds up builds of Scala-based projects
-like Spark. Developers who regularly recompile Spark with Maven will be the most interested in
-Zinc. The project site gives instructions for building and running `zinc`; OS X users can
-install it using `brew install zinc`.
-
-If using the `build/mvn` package `zinc` will automatically be downloaded and leveraged for all
-builds. This process will auto-start after the first time `build/mvn` is called and bind to port
-3030 unless the `ZINC_PORT` environment variable is set. The `zinc` process can subsequently be
-shut down at any time by running `build/zinc-<version>/bin/zinc -shutdown` and will automatically
-restart whenever `build/mvn` is called.
-
 ## Building with SBT
 
 Maven is the official build tool recommended for packaging Spark, and is the *build of reference*.
@@ -159,8 +145,14 @@ can be set to control the SBT build. For example:
 
 To avoid the overhead of launching sbt each time you need to re-compile, you can launch sbt
 in interactive mode by running `build/sbt`, and then run all build commands at the command
-prompt. For more recommendations on reducing build time, refer to the
-[Useful Developer Tools page](http://spark.apache.org/developer-tools.html).
+prompt.
+
+## Speeding up Compilation
+
+Developers who compile Spark frequently may want to speed up compilation; e.g., by using Zinc
+(for developers who build with Maven) or by avoiding re-compilation of the assembly JAR (for
+developers who build with SBT). For more information about how to do this, refer to the
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html#reducing-build-times).
 
 ## Encrypted Filesystems
 
@@ -190,29 +182,16 @@ The following is an example of a command to run the tests:
 
     ./build/mvn test
 
-The ScalaTest plugin also supports running only a specific Scala test suite as follows:
-
-    ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.ReplSuite test
-    ./build/mvn -P... -Dtest=none -DwildcardSuites=org.apache.spark.repl.* test
-
-or a Java test:
-
-    ./build/mvn test -P... -DwildcardSuites=none -Dtest=org.apache.spark.streaming.JavaAPISuite
-
 ## Testing with SBT
 
 The following is an example of a command to run the tests:
 
     ./build/sbt test
 
-To run only a specific test suite as follows:
-
-    ./build/sbt "test-only org.apache.spark.repl.ReplSuite"
-    ./build/sbt "test-only org.apache.spark.repl.*"
-
-To run test suites of a specific sub project as follows:
+## Running Individual Tests
 
-    ./build/sbt core/test
+For information about how to run individual tests, refer to the
+[Useful Developer Tools page](http://spark.apache.org/developer-tools.html#running-individual-tests).
 
 ## PySpark pip installable
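For reference, the individual-test invocations that this commit removes from the docs (now maintained on the Useful Developer Tools page) looked like the following. This is a sketch assuming a Spark source checkout; the suite names (`org.apache.spark.repl.ReplSuite`, etc.) are the examples from the removed text, and any suite or package can be substituted. The `-P...` profile flags are elided in the source and must be filled in for your build.

```shell
# Maven (ScalaTest plugin): run a single Scala suite, or all suites in a package
./build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.repl.ReplSuite test
./build/mvn -Dtest=none -DwildcardSuites=org.apache.spark.repl.* test

# SBT: run a single suite, all suites in a package, or one sub-project's tests
./build/sbt "test-only org.apache.spark.repl.ReplSuite"
./build/sbt "test-only org.apache.spark.repl.*"
./build/sbt core/test
```

These commands cannot run outside a Spark checkout; they are reproduced here only to show the pattern the new docs link describes.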