author    Kousuke Saruta <sarutak@oss.nttdata.co.jp>  2014-08-09 21:10:43 -0700
committer Patrick Wendell <pwendell@gmail.com>        2014-08-09 21:11:00 -0700
commit    4f4a9884d9268ba9808744b3d612ac23c75f105a
tree      8cf35287ff81866efd3b3ce95d45ce08891b21a0 /bin/spark-shell
parent    e45daf226d780f4a7aaabc2de9f04367bee16f26
[SPARK-2894] spark-shell doesn't accept flags
As sryza reported, spark-shell doesn't accept any flags.
The root cause is incorrect usage of spark-submit in spark-shell, and it came to the surface with #1801.
Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>
Author: Cheng Lian <lian.cs.zju@gmail.com>
Closes #1715, Closes #1864, and Closes #1861
Closes #1825 from sarutak/SPARK-2894 and squashes the following commits:
47f3510 [Kousuke Saruta] Merge branch 'master' of git://git.apache.org/spark into SPARK-2894
2c899ed [Kousuke Saruta] Removed useless code from java_gateway.py
98287ed [Kousuke Saruta] Removed useless code from java_gateway.py
513ad2e [Kousuke Saruta] Modified util.sh to enable to use option including white spaces
28a374e [Kousuke Saruta] Modified java_gateway.py to recognize arguments
5afc584 [Cheng Lian] Filter out spark-submit options when starting Python gateway
e630d19 [Cheng Lian] Fixing pyspark and spark-shell CLI options
Diffstat (limited to 'bin/spark-shell')
-rwxr-xr-x  bin/spark-shell | 20 ++++++++++++++------
1 file changed, 14 insertions(+), 6 deletions(-)
diff --git a/bin/spark-shell b/bin/spark-shell
index 756c8179d1..8b7ccd7439 100755
--- a/bin/spark-shell
+++ b/bin/spark-shell
@@ -31,13 +31,21 @@ set -o posix
 ## Global script variables
 FWDIR="$(cd `dirname $0`/..; pwd)"
 
+function usage() {
+  echo "Usage: ./bin/spark-shell [options]"
+  $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
+  exit 0
+}
+
 if [[ "$@" = *--help ]] || [[ "$@" = *-h ]]; then
-  echo "Usage: ./bin/spark-shell [options]"
-  $FWDIR/bin/spark-submit --help 2>&1 | grep -v Usage 1>&2
-  exit 0
+  usage
 fi
 
-function main(){
+source $FWDIR/bin/utils.sh
+SUBMIT_USAGE_FUNCTION=usage
+gatherSparkSubmitOpts "$@"
+
+function main() {
   if $cygwin; then
     # Workaround for issue involving JLine and Cygwin
     # (see http://sourceforge.net/p/jline/bugs/40/).
@@ -46,11 +54,11 @@ function main(){
     # (see https://github.com/sbt/sbt/issues/562).
     stty -icanon min 1 -echo > /dev/null 2>&1
     export SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Djline.terminal=unix"
-    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
+    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"
     stty icanon echo > /dev/null 2>&1
   else
     export SPARK_SUBMIT_OPTS
-    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main spark-shell "$@"
+    $FWDIR/bin/spark-submit --class org.apache.spark.repl.Main "${SUBMISSION_OPTS[@]}" spark-shell "${APPLICATION_OPTS[@]}"
   fi
 }
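The fix hinges on splitting the command line into options destined for spark-submit (placed before the `spark-shell` primary resource) and options destined for the user application (placed after it), using bash arrays so that values containing whitespace survive intact. The sketch below illustrates that splitting idea under simplifying assumptions: `gather_opts` is a hypothetical stand-in for the `gatherSparkSubmitOpts` helper added to bin/utils.sh, and the option list it recognizes is illustrative rather than the full spark-submit set.

```shell
#!/usr/bin/env bash
# Hedged sketch of splitting CLI args into submit-side and app-side arrays,
# in the spirit of gatherSparkSubmitOpts. gather_opts and its option list
# are illustrative assumptions, not the actual utils.sh implementation.
gather_opts() {
  SUBMISSION_OPTS=()
  APPLICATION_OPTS=()
  while (($#)); do
    case "$1" in
      --master | --deploy-mode | --name | --properties-file | --driver-memory)
        # Options that take a value: keep the flag and its value together.
        SUBMISSION_OPTS+=("$1" "$2")
        shift 2
        ;;
      --verbose | -v)
        # Boolean flag understood by spark-submit itself.
        SUBMISSION_OPTS+=("$1")
        shift
        ;;
      *)
        # Everything else is passed through to the application unchanged.
        APPLICATION_OPTS+=("$1")
        shift
        ;;
    esac
  done
}

# Quoted array expansion preserves the embedded space in "My App".
gather_opts --master "local[4]" --name "My App" extra-arg
echo "submission opts: ${SUBMISSION_OPTS[*]}"
echo "application opts: ${APPLICATION_OPTS[*]}"
```

The callers then expand the two arrays as `"${SUBMISSION_OPTS[@]}"` and `"${APPLICATION_OPTS[@]}"` around the `spark-shell` primary resource, which is exactly the shape of the two changed spark-submit invocations in the diff.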