author    Marcelo Vanzin <vanzin@cloudera.com>  2015-06-05 14:11:38 +0200
committer Sean Owen <sowen@cloudera.com>  2015-06-05 14:11:38 +0200
commit    b16b5434ff44c42e4b3a337f9af147669ba44896 (patch)
tree      47d87caa16e88d2e08a4f1794227ec5d5caff795 /project
parent    da20c8ca37663738112b04657057858ee3e55072 (diff)
[MINOR] [BUILD] Use custom temp directory during build.
Even with all the efforts to clean up the temp directories created by unit tests, Spark leaves a lot of garbage in /tmp after a test run. This change overrides java.io.tmpdir to place those files under the build directory instead.

After a full sbt unit test run, I was left with > 400 MB of temp files. Since they're now under the build dir, it's much easier to clean them up. Also make a slight change to a unit test so that it does not pollute the source directory with test data.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #6653 from vanzin/unit-test-tmp and squashes the following commits:

31e2dd5 [Marcelo Vanzin] Fix tests that depend on each other.
aa92944 [Marcelo Vanzin] [minor] [build] Use custom temp directory during build.
Diffstat (limited to 'project')
-rw-r--r--  project/SparkBuild.scala | 1 +
1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/project/SparkBuild.scala b/project/SparkBuild.scala
index ef3a175bac..921f1599fe 100644
--- a/project/SparkBuild.scala
+++ b/project/SparkBuild.scala
@@ -496,6 +496,7 @@ object TestSettings {
"SPARK_DIST_CLASSPATH" ->
(fullClasspath in Test).value.files.map(_.getAbsolutePath).mkString(":").stripSuffix(":"),
"JAVA_HOME" -> sys.env.get("JAVA_HOME").getOrElse(sys.props("java.home"))),
+ javaOptions in Test += s"-Djava.io.tmpdir=$sparkHome/target/tmp",
javaOptions in Test += "-Dspark.test.home=" + sparkHome,
javaOptions in Test += "-Dspark.testing=1",
javaOptions in Test += "-Dspark.port.maxRetries=100",
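Because sbt forks a separate JVM for tests (the surrounding `TestSettings` block passes these `javaOptions` to that fork), the `-Djava.io.tmpdir=$sparkHome/target/tmp` flag means every API that defaults to the system temp directory now writes under the build tree. A minimal sketch of what test code observes, assuming the hypothetical object name `TmpDirDemo` (not part of Spark) and that the configured directory exists:

```scala
import java.io.File

object TmpDirDemo {
  // Create a temp file under the JVM's configured java.io.tmpdir
  // (set via -Djava.io.tmpdir on the forked test JVM) and return
  // the canonical path of the directory it actually landed in.
  def tempFileParent(): String = {
    val tmpDir = new File(sys.props("java.io.tmpdir"))
    tmpDir.mkdirs() // the build normally creates target/tmp; ensure it exists here
    val f = File.createTempFile("spark-test", ".tmp", tmpDir)
    try f.getParentFile.getCanonicalPath
    finally f.delete()
  }
}
```

With the sbt setting applied, `tempFileParent()` resolves to `<sparkHome>/target/tmp`, so a plain `rm -rf target/tmp` (or any clean of the build dir) removes the leftovers instead of letting them accumulate in /tmp.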