From d90ddf12b6bea2162e982e800c96d2c94f66b347 Mon Sep 17 00:00:00 2001
From: Sandy Ryza
Date: Fri, 14 Nov 2014 14:21:57 -0800
Subject: SPARK-4375. no longer require -Pscala-2.10

It seems like the winds might have moved away from this approach, but I
wanted to post the PR anyway because I got it working and to show what it
would look like.

Author: Sandy Ryza

Closes #3239 from sryza/sandy-spark-4375 and squashes the following commits:

0ffbe95 [Sandy Ryza] Enable -Dscala-2.11 in sbt
cd42d94 [Sandy Ryza] Update doc
f6644c3 [Sandy Ryza] SPARK-4375 take 2

(cherry picked from commit f5f757e4ed80759dc5668c63d5663651689f8da8)
Signed-off-by: Patrick Wendell
---
 docs/building-spark.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/building-spark.md b/docs/building-spark.md
index 20ba7da5d7..bb18414092 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -113,9 +113,9 @@ mvn -Pyarn -Phive -Phive-thriftserver-0.12.0 -Phadoop-2.4 -Dhadoop.version=2.4.0
 {% endhighlight %}
 
 # Building for Scala 2.11
-To produce a Spark package compiled with Scala 2.11, use the `-Pscala-2.11` profile:
+To produce a Spark package compiled with Scala 2.11, use the `-Dscala-2.11` property:
 
-    mvn -Pyarn -Phadoop-2.4 -Pscala-2.11 -DskipTests clean package
+    mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
 
 Scala 2.11 support in Spark is experimental and does not support a few features.
 Specifically, Spark's external Kafka library and JDBC component are not yet
--
cgit v1.2.3
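
For reference, a minimal sketch of the sbt-side usage implied by the squashed
commit "Enable -Dscala-2.11 in sbt". The `sbt/sbt` launcher script and the
`assembly` task ship with Spark 1.x, but this exact command line is an
assumption based on the commit message and the Maven command in the diff; it
is not shown in the patch itself:

    # Assumed sbt equivalent of the Maven command above (not part of this patch).
    # -Dscala-2.11 selects the Scala 2.11 build via a JVM system property,
    # mirroring the property-based selection this patch documents for Maven.
    sbt/sbt -Pyarn -Phadoop-2.4 -Dscala-2.11 assembly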