author | Josh Rosen <joshrosen@databricks.com> | 2016-01-30 00:20:28 -0800 |
---|---|---|
committer | Reynold Xin <rxin@databricks.com> | 2016-01-30 00:20:28 -0800 |
commit | 289373b28cd2546165187de2e6a9185a1257b1e7 (patch) | |
tree | b541a6e52a4ff20604689efafbfa0df7ad0901f5 /repl/scala-2.11/src/test | |
parent | dab246f7e4664d36073ec49d9df8a11c5e998cdb (diff) | |
[SPARK-6363][BUILD] Make Scala 2.11 the default Scala version
This patch changes Spark's build to make Scala 2.11 the default Scala version. To be clear, this does not mean that Spark will stop supporting Scala 2.10: users will still be able to compile Spark for Scala 2.10 by following the instructions on the "Building Spark" page; however, it does mean that Scala 2.11 will be the default Scala version used by our CI builds (including pull request builds).
The Scala 2.11 compiler is faster than 2.10, so I think we can look forward to a speedup in our CI builds (the Maven compile-only builds, for instance, appear to be about 2x faster).
After this patch is merged, I'll update Jenkins to add new compile-only jobs to ensure that Scala 2.10 compilation doesn't break.
Author: Josh Rosen <joshrosen@databricks.com>
Closes #10608 from JoshRosen/SPARK-6363.
Diffstat (limited to 'repl/scala-2.11/src/test')
-rw-r--r-- | repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala | 7 |
1 file changed, 1 insertion(+), 6 deletions(-)
diff --git a/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala b/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala
index 63f3688c9e..b9ed79da42 100644
--- a/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala
+++ b/repl/scala-2.11/src/test/scala/org/apache/spark/repl/ReplSuite.scala
@@ -50,12 +50,7 @@ class ReplSuite extends SparkFunSuite {
     System.setProperty(CONF_EXECUTOR_CLASSPATH, classpath)
     System.setProperty("spark.master", master)
-    val interp = {
-      new SparkILoop(in, new PrintWriter(out))
-    }
-    org.apache.spark.repl.Main.interp = interp
-    Main.main(Array("-classpath", classpath)) // call main
-    org.apache.spark.repl.Main.interp = null
+    Main.doMain(Array("-classpath", classpath), new SparkILoop(in, new PrintWriter(out)))
 
     if (oldExecutorClasspath != null) {
       System.setProperty(CONF_EXECUTOR_CLASSPATH, oldExecutorClasspath)
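For context on the one-line replacement: the REPL's Main object exposes a doMain entry point that takes the interpreter instance as a parameter, so the suite no longer has to assign the global Main.interp field, call Main.main, and reset the field afterwards. The sketch below illustrates that shape. Main.doMain, Main.interp, and the SparkILoop(in, out) constructor all appear in the diff above, but the stub SparkILoop class and its process method are stand-ins for illustration, not Spark's actual implementation:

```scala
import java.io.{BufferedReader, PrintWriter}

// Stand-in for Spark's wrapper around the Scala interpreter loop; the real
// class lives in org.apache.spark.repl and builds on the Scala REPL's ILoop.
class SparkILoop(in: BufferedReader = null, out: PrintWriter = null) {
  def process(args: List[String]): Unit = {
    // ... would configure compiler settings and run the read-eval-print loop ...
  }
}

object Main {
  // Still published as a field so other code can reach the active interpreter,
  // but tests no longer need to assign and reset it themselves.
  var interp: SparkILoop = _

  // Production entry point: builds a default interpreter and delegates.
  def main(args: Array[String]): Unit = doMain(args, new SparkILoop())

  // Test-visible entry point: the caller supplies the interpreter, so a suite
  // can inject one wired to an in-memory reader and writer in a single call.
  def doMain(args: Array[String], _interp: SparkILoop): Unit = {
    interp = _interp
    interp.process(args.toList)
  }
}
```

With that shape in place, the six lines of set-up and tear-down in the old test collapse to the single doMain call shown in the diff.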