author     Marcelo Vanzin <vanzin@cloudera.com>  2015-06-08 15:37:28 +0100
committer  Sean Owen <sowen@cloudera.com>        2015-06-08 15:37:28 +0100
commit     a1d9e5cc60d317ecf8fe390b66b623ae39c4534d (patch)
tree       1801288569cadc5f429fd33baf058b2504e11792 /project/SparkBuild.scala
parent     03ef6be9ce61a13dcd9d8c71298fb4be39119411 (diff)
[SPARK-8126] [BUILD] Use custom temp directory during build.
Even with all the efforts to clean up the temp directories created by unit tests, Spark leaves a lot of garbage in /tmp after a test run. This change overrides java.io.tmpdir to place those files under the build directory instead. After a full sbt unit test run, I was left with > 400 MB of temp files; since they're now under the build dir, it's much easier to clean them up.

Also make a slight change to a unit test so that it does not pollute the source directory with test data.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #6674 from vanzin/SPARK-8126 and squashes the following commits:

0f8ad41 [Marcelo Vanzin] Make sure tmp dir exists when tests run.
643e916 [Marcelo Vanzin] [MINOR] [BUILD] Use custom temp directory during build.
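As an aside (not part of this patch), a minimal Scala sketch of why the override is enough: anything a test creates through the standard JDK temp-file APIs honors java.io.tmpdir, so once the forked test JVM is started with -Djava.io.tmpdir pointing at target/tmp, scratch files land there instead of in /tmp. The TempDirDemo name below is purely illustrative:

import java.nio.file.Files

object TempDirDemo {
  def main(args: Array[String]): Unit = {
    // The forked test JVM receives -Djava.io.tmpdir=<spark>/target/tmp, so the
    // JDK temp-file APIs resolve their base directory to target/tmp, not /tmp.
    println(s"java.io.tmpdir = ${System.getProperty("java.io.tmpdir")}")
    val scratch = Files.createTempDirectory("spark-test")
    println(s"scratch dir    = $scratch")
  }
}

Temp files created this way then disappear with the rest of the build output, for example when target/ is deleted.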
Diffstat (limited to 'project/SparkBuild.scala')
-rw-r--r--  project/SparkBuild.scala | 6
1 file changed, 6 insertions, 0 deletions
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index ef3a175bac..d7e374558c 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -51,6 +51,11 @@ object BuildCommons {
// Root project.
val spark = ProjectRef(buildLocation, "spark")
val sparkHome = buildLocation
+
+ val testTempDir = s"$sparkHome/target/tmp"
+ if (!new File(testTempDir).isDirectory()) {
+ require(new File(testTempDir).mkdirs())
+ }
}
object SparkBuild extends PomBuild {
@@ -496,6 +501,7 @@ object TestSettings {
"SPARK_DIST_CLASSPATH" ->
(fullClasspath in Test).value.files.map(_.getAbsolutePath).mkString(":").stripSuffix(":"),
"JAVA_HOME" -> sys.env.get("JAVA_HOME").getOrElse(sys.props("java.home"))),
+ javaOptions in Test += s"-Djava.io.tmpdir=$testTempDir",
javaOptions in Test += "-Dspark.test.home=" + sparkHome,
javaOptions in Test += "-Dspark.testing=1",
javaOptions in Test += "-Dspark.port.maxRetries=100",