From 6f688608915a82e6dcf4a27dc92e4b25a3570fa4 Mon Sep 17 00:00:00 2001
From: Mike
Date: Thu, 11 Apr 2013 20:52:06 -0700
Subject: Reversed the order of tests to find a scala executable (in the case
 when SPARK_LAUNCH_WITH_SCALA is defined): instead of checking in the PATH
 first, and only then (if not found) for SCALA_HOME, now we check for
 SCALA_HOME first, and only then (if not defined) do we look in the PATH.

The advantage is that now if the user has a more recent (non-compatible)
version of scala in her PATH, she can use SCALA_HOME to point to the older
(compatible) version for use with spark.

Suggested by Josh Rosen in this thread:
https://groups.google.com/forum/?fromgroups=#!topic/spark-users/NC9JKvP8808
---
 run | 11 ++++++-----
 1 file changed, 6 insertions(+), 5 deletions(-)

diff --git a/run b/run
index 73239097b9..756f8703f2 100755
--- a/run
+++ b/run
@@ -47,14 +47,15 @@ case "$1" in
 esac
 
 if [ "$SPARK_LAUNCH_WITH_SCALA" == "1" ]; then
-  if [ `command -v scala` ]; then
-    RUNNER="scala"
+  if [ "$SCALA_HOME" ]; then
+    RUNNER="${SCALA_HOME}/bin/scala"
   else
-    if [ -z "$SCALA_HOME" ]; then
-      echo "SCALA_HOME is not set" >&2
+    if [ `command -v scala` ]; then
+      RUNNER="scala"
+    else
+      echo "SCALA_HOME is not set and scala is not in PATH" >&2
       exit 1
     fi
-    RUNNER="${SCALA_HOME}/bin/scala"
   fi
 else
   if [ `command -v java` ]; then
--
cgit v1.2.3