author     Matei Zaharia <matei@eecs.berkeley.edu>   2013-08-27 19:23:54 -0700
committer  Matei Zaharia <matei@eecs.berkeley.edu>   2013-08-29 21:19:06 -0700
commit     666d93c294458cb056cb590eb11bb6cf979861e5 (patch)
tree       8a05c1073bef461b141c60736052a1f029e3da38 /project/SparkBuild.scala
parent     d7dec938e503b86d1b338c4df3439d3649a76294 (diff)
Update Maven build to create assemblies expected by new scripts
This includes the following changes:
- The "assembly" package now builds in Maven by default, and creates an
assembly containing both hadoop-client and Spark, unlike the old
BigTop distribution assembly that skipped hadoop-client
- There is now a bigtop-dist package to build the old BigTop assembly
- The repl-bin package is no longer built by default since the scripts
  don't rely on it; instead it can be enabled with -Prepl-bin
- Py4J is now included in the assembly/lib folder as a local Maven repo,
so that the Maven package can link to it
- run-example now adds the original Spark classpath as well because the
Maven examples assembly lists spark-core and such as provided
- The various Maven projects add a spark-yarn dependency correctly
Diffstat (limited to 'project/SparkBuild.scala')
-rw-r--r--  project/SparkBuild.scala  4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index 8797e65b8d..2e26812671 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -41,7 +41,7 @@ object SparkBuild extends Build {
     .dependsOn(core, bagel, mllib) dependsOn(maybeYarn: _*)
 
   lazy val examples = Project("examples", file("examples"), settings = examplesSettings)
-    .dependsOn(core, mllib, bagel, streaming)
+    .dependsOn(core, mllib, bagel, streaming) dependsOn(maybeYarn: _*)
 
   lazy val tools = Project("tools", file("tools"), settings = toolsSettings) dependsOn(core) dependsOn(streaming)
@@ -261,7 +261,7 @@ object SparkBuild extends Build {
   def yarnSettings = sharedSettings ++ Seq(
     name := "spark-yarn"
-  ) ++ extraYarnSettings ++ assemblySettings ++ extraAssemblySettings
+  ) ++ extraYarnSettings
 
   // Conditionally include the YARN dependencies because some tools look at all sub-projects and will complain
   // if we refer to nonexistent dependencies (e.g. hadoop-yarn-api from a Hadoop version without YARN).
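For context, the `dependsOn(maybeYarn: _*)` splat used in the hunks above relies on sbt's vararg `dependsOn`: when YARN is disabled, the sequence is empty and the call adds no dependency. A minimal sketch of the pattern, assuming the old sbt `Build`-trait style and hypothetical project names (not the full SparkBuild.scala):

```scala
import sbt._

object SketchBuild extends Build {
  // Assumption: YARN support is toggled by an environment variable,
  // mirroring how SparkBuild.scala conditionally includes the sub-project.
  val isYarnEnabled = Option(System.getenv("SPARK_WITH_YARN")).isDefined

  lazy val core = Project("core", file("core"))
  lazy val yarn = Project("yarn", file("yarn")) dependsOn(core)

  // Empty when YARN is off, so splatting it into dependsOn adds nothing;
  // this avoids referring to YARN classes on Hadoop versions without YARN.
  lazy val maybeYarn: Seq[ClasspathDependency] =
    if (isYarnEnabled) Seq[ClasspathDependency](yarn)
    else Seq[ClasspathDependency]()

  lazy val examples = Project("examples", file("examples"))
    .dependsOn(core) dependsOn(maybeYarn: _*)
}
```

The same splat appears twice in the patch: `examples` gains it so the examples build against spark-yarn when enabled, while `core` already used it at line 41.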