From 289373b28cd2546165187de2e6a9185a1257b1e7 Mon Sep 17 00:00:00 2001
From: Josh Rosen
Date: Sat, 30 Jan 2016 00:20:28 -0800
Subject: [SPARK-6363][BUILD] Make Scala 2.11 the default Scala version

This patch changes Spark's build to make Scala 2.11 the default Scala
version. To be clear, this does not mean that Spark will stop supporting
Scala 2.10: users will still be able to compile Spark for Scala 2.10 by
following the instructions on the "Building Spark" page; however, it does
mean that Scala 2.11 will be the default Scala version used by our CI
builds (including pull request builds).

The Scala 2.11 compiler is faster than 2.10, so I think we'll be able to
look forward to a slight speedup in our CI builds (it looks like it's about
2X faster for the Maven compile-only builds, for instance).

After this patch is merged, I'll update Jenkins to add new compile-only
jobs to ensure that Scala 2.10 compilation doesn't break.

Author: Josh Rosen

Closes #10608 from JoshRosen/SPARK-6363.
---
 docs/_plugins/copy_api_dirs.rb |  2 +-
 docs/building-spark.md         | 10 ++++------
 2 files changed, 5 insertions(+), 7 deletions(-)
(limited to 'docs')

diff --git a/docs/_plugins/copy_api_dirs.rb b/docs/_plugins/copy_api_dirs.rb
index 174c202e37..f926d67e6b 100644
--- a/docs/_plugins/copy_api_dirs.rb
+++ b/docs/_plugins/copy_api_dirs.rb
@@ -37,7 +37,7 @@ if not (ENV['SKIP_API'] == '1')

   # Copy over the unified ScalaDoc for all projects to api/scala.
   # This directory will be copied over to _site when `jekyll` command is run.
-  source = "../target/scala-2.10/unidoc"
+  source = "../target/scala-2.11/unidoc"
   dest = "api/scala"

   puts "Making directory " + dest
diff --git a/docs/building-spark.md b/docs/building-spark.md
index e1abcf1be5..975e1b295c 100644
--- a/docs/building-spark.md
+++ b/docs/building-spark.md
@@ -114,13 +114,11 @@ By default Spark will build with Hive 0.13.1 bindings.
     mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
 {% endhighlight %}

-# Building for Scala 2.11
-To produce a Spark package compiled with Scala 2.11, use the `-Dscala-2.11` property:
+# Building for Scala 2.10
+To produce a Spark package compiled with Scala 2.10, use the `-Dscala-2.10` property:

-    ./dev/change-scala-version.sh 2.11
-    mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
-
-Spark does not yet support its JDBC component for Scala 2.11.
+    ./dev/change-scala-version.sh 2.10
+    mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package

 # Spark Tests in Maven

--
cgit v1.2.3
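
For quick reference, a minimal sketch of the two build paths after this
patch lands (assuming a Spark source checkout and the same `-Pyarn
-Phadoop-2.4` profiles used in the docs above; the commands restate what
the updated documentation in the diff describes):

    # Default build after this patch: Scala 2.11, no extra Scala flags needed
    mvn -Pyarn -Phadoop-2.4 -DskipTests clean package

    # Cross-building for Scala 2.10: rewrite the POMs for 2.10 first,
    # then enable the scala-2.10 profile via the -Dscala-2.10 property
    ./dev/change-scala-version.sh 2.10
    mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -DskipTests clean package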