author     Josh Rosen <joshrosen@databricks.com>  2015-12-31 20:23:19 -0800
committer  Josh Rosen <joshrosen@databricks.com>  2015-12-31 20:23:19 -0800
commit     5adec63a922d6f60cd6cb87ebdab61a17131ac1a
tree       d7632e0c70d0e33135dcaee4e68bfd07fa15ef53 /dev
parent     efb10cc9ad370955cec64e8f63a3b646058a9840
[SPARK-10359][PROJECT-INFRA] Multiple fixes to dev/test-dependencies.sh script
This patch includes multiple fixes for the `dev/test-dependencies.sh` script (which was introduced in #10461):
- Use `build/mvn --force` instead of `mvn` in one additional place.
- Explicitly set a zero exit code on success.
- Set `LC_ALL=C` to make `sort` results agree across machines (see https://stackoverflow.com/questions/28881/).
- Set `should_run_build_tests=True` for `build` module (this somehow got lost).
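The locale fix is the subtle one: `sort` collates according to the current locale, so the same input can order differently across machines and make generated dependency manifests diff unstably. A minimal sketch (not part of the patch itself) of byte-order collation under `LC_ALL=C`:

```shell
# Under LC_ALL=C, sort compares raw bytes, so uppercase ASCII letters
# (e.g. "B" = 0x42) order before lowercase ones (e.g. "a" = 0x61).
# Locale-aware collations such as en_US.UTF-8 typically interleave cases
# instead, which is why an unpinned locale yields machine-dependent output.
printf 'a\nB\nc\n' | LC_ALL=C sort
# Output:
#   B
#   a
#   c
```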
Author: Josh Rosen <joshrosen@databricks.com>
Closes #10543 from JoshRosen/dep-script-fixes.
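On the exit-code fix: under `set -e`, a failure on the left-hand side of `&&` does not abort the script, but its nonzero status can still become the script's exit status if it occurs on the last line. A hedged sketch of that failure mode (the `maybe` flag is illustrative, not taken from the real script):

```shell
#!/usr/bin/env bash
set -e
maybe=false                     # illustrative flag, not from the real script
$maybe && echo "optional step"  # errexit ignores this failure (left of &&)...
# ...but if this were the script's final line, its status (1) would become
# the script's exit status. An explicit trailing `exit 0` pins success.
exit 0
```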
Diffstat (limited to 'dev')
-rw-r--r--   dev/sparktestsupport/modules.py | 3
-rwxr-xr-x   dev/test-dependencies.sh        | 8
2 files changed, 9 insertions, 2 deletions
diff --git a/dev/sparktestsupport/modules.py b/dev/sparktestsupport/modules.py
index 4667b289f5..47cd600bd1 100644
--- a/dev/sparktestsupport/modules.py
+++ b/dev/sparktestsupport/modules.py
@@ -402,7 +402,8 @@ build = Module(
     source_file_regexes=[
         ".*pom.xml",
         "dev/test-dependencies.sh",
-    ]
+    ],
+    should_run_build_tests=True
 )
 
 ec2 = Module(
diff --git a/dev/test-dependencies.sh b/dev/test-dependencies.sh
index 984e29d1be..4e260e2abf 100755
--- a/dev/test-dependencies.sh
+++ b/dev/test-dependencies.sh
@@ -22,6 +22,10 @@ set -e
 FWDIR="$(cd "`dirname $0`"/..; pwd)"
 cd "$FWDIR"
 
+# Explicitly set locale in order to make `sort` output consistent across machines.
+# See https://stackoverflow.com/questions/28881 for more details.
+export LC_ALL=C
+
 # TODO: This would be much nicer to do in SBT, once SBT supports Maven-style resolution.
 
 # NOTE: These should match those in the release publishing script
@@ -37,7 +41,7 @@ HADOOP_PROFILES=(
 
 # resolve Spark's internal submodule dependencies.
 # See http://stackoverflow.com/a/3545363 for an explanation of this one-liner:
-OLD_VERSION=$(mvn help:evaluate -Dexpression=project.version|grep -Ev '(^\[|Download\w+:)')
+OLD_VERSION=$($MVN help:evaluate -Dexpression=project.version|grep -Ev '(^\[|Download\w+:)')
 TEMP_VERSION="spark-$(date +%s | tail -c6)"
 
 function reset_version {
@@ -100,3 +104,5 @@ for HADOOP_PROFILE in "${HADOOP_PROFILES[@]}"; do
     exit 1
   fi
 done
+
+exit 0
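For context on the `OLD_VERSION` one-liner the diff touches: `mvn help:evaluate` mixes the evaluated value with log lines, and the `grep -Ev '(^\[|Download\w+:)'` filter strips those. A sketch with fabricated sample log lines (the version string and URL below are made up for illustration; `\w` in an `-E` pattern is a GNU grep extension):

```shell
# Simulated `mvn help:evaluate -Dexpression=project.version` output:
# "["-prefixed log lines and "Downloading:" progress lines are dropped
# by the filter; only the evaluated version survives.
printf '%s\n' \
  '[INFO] Scanning for projects...' \
  'Downloading: https://repo.example/org/apache/spark/maven-metadata.xml' \
  '1.6.0-SNAPSHOT' \
  '[INFO] BUILD SUCCESS' \
  | grep -Ev '(^\[|Download\w+:)'
# Prints: 1.6.0-SNAPSHOT
```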