From 60150cf00a70e684d2cad864ab055ad53106938b Mon Sep 17 00:00:00 2001
From: Jean-Baptiste Onofré
Date: Thu, 8 Oct 2015 11:38:39 +0100
Subject: [SPARK-10883] Add a note about how to build Spark sub-modules
 (reactor)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

Author: Jean-Baptiste Onofré

Closes #8993 from jbonofre/SPARK-10883-2.
---
 docs/building-spark.md | 11 +++++++++++
 1 file changed, 11 insertions(+)

diff --git a/docs/building-spark.md b/docs/building-spark.md
index 4db32cfd62..4d929ee10a 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -144,6 +144,17 @@ The ScalaTest plugin also supports running only a specific test suite as follows
 
     mvn -Dhadoop.version=... -DwildcardSuites=org.apache.spark.repl.ReplSuite test
 
+# Building submodules individually
+
+It's possible to build Spark sub-modules using the `mvn -pl` option.
+
+For instance, you can build the Spark Streaming module using:
+
+{% highlight bash %}
+mvn -pl :spark-streaming_2.10 clean install
+{% endhighlight %}
+
+where `spark-streaming_2.10` is the `artifactId` as defined in `streaming/pom.xml` file.
 # Continuous Compilation
--
cgit v1.2.3
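
As a supplementary usage sketch (not part of the patch itself): when the selected sub-module depends on other Spark modules that are not yet installed in the local Maven repository, the `-pl` selection can be combined with Maven's `-am` (`--also-make`) flag so that the required upstream modules are built in the same reactor run. For example:

    # Hypothetical invocation: build spark-streaming_2.10 together with the
    # Spark modules it depends on (such as spark-core) in one reactor run.
    mvn -pl :spark-streaming_2.10 -am clean install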