author     Josh Rosen <joshrosen@databricks.com>    2016-01-04 01:04:29 -0800
committer  Josh Rosen <joshrosen@databricks.com>    2016-01-04 01:04:29 -0800
commit     9fd7a2f0247ed6cea0e8dbcdd2b24f41200b3e24 (patch)
tree       6949efd99af864cee020c6e6b930aaa889b9000f /dev
parent     0d165ec2050aa06aa545d74c8d7c2ff197fa02de (diff)
[SPARK-10359][PROJECT-INFRA] Use more random number in dev/test-dependencies.sh; fix version switching
This patch aims to fix another potential source of flakiness in the `dev/test-dependencies.sh` script.
pwendell's original patch and my version used `$(date +%s | tail -c6)` to generate a suffix for installing temporary Spark versions into the local Maven cache. That value changes only once per second, so it is highly collision-prone when concurrent builds launch on AMPLab Jenkins. To reduce the potential for conflicts, this patch updates the script to call Python's random number generator instead.
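The difference between the two suffix schemes can be sketched in isolation as follows; this is an illustration of the collision argument, not the patch itself:

```shell
#!/usr/bin/env bash
# Old scheme: the last five digits of the Unix timestamp. Any two builds
# launched within the same second receive the identical suffix.
OLD_SUFFIX="spark-$(date +%s | tail -c6)"

# New scheme: a six-digit value from Python's random number generator, so
# concurrent builds are very unlikely to collide. (The patch itself invokes
# `python -S`; `python3` is used here so the sketch runs on modern systems.)
NEW_SUFFIX="spark-$(python3 -S -c "import random; print(random.randrange(100000, 999999))")"

echo "$OLD_SUFFIX"
echo "$NEW_SUFFIX"
```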
I also fixed a bug in how we captured the original project version; the bug was causing the exit handler code to fail.
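The capture-and-check pattern the fix introduces around the Maven call can be shown standalone; here `echo "1.6.0-SNAPSHOT"` is a stand-in for the real `$MVN ... exec` invocation:

```shell
#!/usr/bin/env bash
set -e

# Under `set -e`, a failing command substitution in an assignment aborts the
# script before any diagnostic can be printed, so errexit is suspended just
# around the command whose exit status we want to inspect ourselves.
set +e
OLD_VERSION=$(echo "1.6.0-SNAPSHOT")   # stand-in for the $MVN invocation
status=$?
set -e

if [ $status -ne 0 ]; then
  echo -e "Error while getting version string from Maven:\n$OLD_VERSION"
  exit 1
fi
echo "$OLD_VERSION"
```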
Author: Josh Rosen <joshrosen@databricks.com>
Closes #10558 from JoshRosen/build-dep-tests-round-3.
Diffstat (limited to 'dev')
-rwxr-xr-x  dev/run-tests.py          |  4
-rwxr-xr-x  dev/test-dependencies.sh  | 16
2 files changed, 15 insertions, 5 deletions
diff --git a/dev/run-tests.py b/dev/run-tests.py
index 9db728d799..8726889cbc 100755
--- a/dev/run-tests.py
+++ b/dev/run-tests.py
@@ -419,8 +419,8 @@ def run_python_tests(test_modules, parallelism):


 def run_build_tests():
-    # set_title_and_block("Running build tests", "BLOCK_BUILD_TESTS")
-    # run_cmd([os.path.join(SPARK_HOME, "dev", "test-dependencies.sh")])
+    set_title_and_block("Running build tests", "BLOCK_BUILD_TESTS")
+    run_cmd([os.path.join(SPARK_HOME, "dev", "test-dependencies.sh")])
     pass
diff --git a/dev/test-dependencies.sh b/dev/test-dependencies.sh
index d6a32717f5..424ce6ad76 100755
--- a/dev/test-dependencies.sh
+++ b/dev/test-dependencies.sh
@@ -42,9 +42,19 @@ HADOOP_PROFILES=(
 # the old version. We need to do this because the `dependency:build-classpath` task needs to
 # resolve Spark's internal submodule dependencies.

-# See http://stackoverflow.com/a/3545363 for an explanation of this one-liner:
-OLD_VERSION=$($MVN help:evaluate -Dexpression=project.version|grep -Ev '(^\[|Download\w+:)')
-TEMP_VERSION="spark-$(date +%s | tail -c6)"
+# From http://stackoverflow.com/a/26514030
+set +e
+OLD_VERSION=$($MVN -q \
+    -Dexec.executable="echo" \
+    -Dexec.args='${project.version}' \
+    --non-recursive \
+    org.codehaus.mojo:exec-maven-plugin:1.3.1:exec)
+if [ $? != 0 ]; then
+  echo -e "Error while getting version string from Maven:\n$OLD_VERSION"
+  exit 1
+fi
+set -e
+TEMP_VERSION="spark-$(python -S -c "import random; print(random.randrange(100000, 999999))")"

 function reset_version {
   # Delete the temporary POMs that we wrote to the local Maven repo: