author    wangfei <wangfei1@huawei.com>  2014-11-02 22:02:05 -0800
committer Patrick Wendell <pwendell@gmail.com>  2014-11-02 22:02:05 -0800
commit    001acc446345ccb1e494af9ff1d16dd65db8034e (patch)
tree      6d1eabaa20fd9de5529d0558f98bbf21bead45e9 /docs
parent    d6e4c5917522b9fb6653ddc0634e93ff2dcf82be (diff)
[SPARK-4177][Doc]update build doc since JDBC/CLI support hive 13 now
Author: wangfei <wangfei1@huawei.com>

Closes #3042 from scwf/patch-9 and squashes the following commits:

3784ed1 [wangfei] remove 'TODO'
1891553 [wangfei] update build doc since JDBC/CLI support hive 13
Diffstat (limited to 'docs')
-rw-r--r--  docs/building-spark.md  17
1 file changed, 7 insertions(+), 10 deletions(-)
diff --git a/docs/building-spark.md b/docs/building-spark.md
index 4cc0b1f2e5..238ddae155 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -99,14 +99,11 @@ mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
mvn -Pyarn-alpha -Phadoop-2.3 -Dhadoop.version=2.3.0 -Dyarn.version=0.23.7 -DskipTests clean package
{% endhighlight %}
-<!--- TODO: Update this when Hive 0.13 JDBC is added -->
-
# Building With Hive and JDBC Support
To enable Hive integration for Spark SQL along with its JDBC server and CLI,
add the `-Phive` profile to your existing build options. By default Spark
will build with Hive 0.13.1 bindings. You can also build for Hive 0.12.0 using
-the `-Phive-0.12.0` profile. NOTE: currently the JDBC server is only
-supported for Hive 0.12.0.
+the `-Phive-0.12.0` profile.
{% highlight bash %}
# Apache Hadoop 2.4.X with Hive 13 support
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean package
@@ -121,8 +118,8 @@ Tests are run by default via the [ScalaTest Maven plugin](http://www.scalatest.o
Some of the tests require Spark to be packaged first, so always run `mvn package` with `-DskipTests` the first time. The following is an example of a correct (build, test) sequence:
- mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive -Phive-0.12.0 clean package
- mvn -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 test
+ mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive clean package
+ mvn -Pyarn -Phadoop-2.3 -Phive test
The ScalaTest plugin also supports running only a specific test suite as follows:
@@ -185,16 +182,16 @@ can be set to control the SBT build. For example:
Some of the tests require Spark to be packaged first, so always run `sbt/sbt assembly` the first time. The following is an example of a correct (build, test) sequence:
- sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 assembly
- sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 test
+ sbt/sbt -Pyarn -Phadoop-2.3 -Phive assembly
+ sbt/sbt -Pyarn -Phadoop-2.3 -Phive test
To run only a specific test suite as follows:
- sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 "test-only org.apache.spark.repl.ReplSuite"
+ sbt/sbt -Pyarn -Phadoop-2.3 -Phive "test-only org.apache.spark.repl.ReplSuite"
To run test suites of a specific sub project as follows:
- sbt/sbt -Pyarn -Phadoop-2.3 -Phive -Phive-0.12.0 core/test
+ sbt/sbt -Pyarn -Phadoop-2.3 -Phive core/test
# Speeding up Compilation with Zinc
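Taken together, the commands this patch documents boil down to the following: a `-Phive` build now defaults to Hive 0.13.1 bindings (JDBC server and CLI included), while `-Phive-0.12.0` remains as an opt-in fallback. A quick sketch, using the same profiles and Hadoop version the patched doc uses:

```shell
# Default: Apache Hadoop 2.4.X with Hive 0.13.1 bindings
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests clean package

# Fallback: add the extra profile to build against Hive 0.12.0 instead
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-0.12.0 -DskipTests clean package
```

The `-DskipTests` flag matters on the first build: some test suites depend on a packaged assembly, so package once without tests, then run `mvn -Pyarn -Phadoop-2.4 -Phive test` (or the SBT equivalents shown above) afterwards.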