author    Aaron Davidson <aaron@databricks.com>  2014-04-07 13:06:30 -0700
committer Patrick Wendell <pwendell@gmail.com>  2014-04-07 13:06:30 -0700
commit    0307db0f55b714930c7ea118d5451190ea8c1a94 (patch)
tree      aef07717fd1658760a51d77d2b22445bbfe9921e /bin
parent    2a2ca48be61ed0d72c4347e1c042a264b94db3e8 (diff)
SPARK-1099: Introduce local[*] mode to infer number of cores
This is the default mode for running spark-shell and pyspark, intended to allow users running spark for the first time to see the performance benefits of using multiple cores, while not breaking backwards compatibility for users who use "local" mode and expect exactly 1 core.
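The inference described above can be sketched in Python (pyspark is one of the two entry points this commit changes). This is a simplified illustration of how a `local[...]` master string maps to a thread count, not Spark's actual parser, which lives in the Scala `SparkContext` and handles more master forms; the function name `infer_local_threads` is hypothetical:

```python
import os

def infer_local_threads(master: str) -> int:
    """Return the worker-thread count implied by a local[...] master string.

    Simplified sketch: the real parsing in Spark handles additional forms
    (e.g. cluster URLs) and uses the JVM's availableProcessors for "*".
    """
    if master == "local":
        # Backwards compatible: plain "local" still means exactly one core.
        return 1
    if master == "local[*]":
        # New in SPARK-1099: infer the number of cores on the machine.
        return os.cpu_count()
    if master.startswith("local[") and master.endswith("]"):
        # Explicit thread count, e.g. "local[4]".
        return int(master[len("local["):-1])
    raise ValueError(f"not a local master string: {master!r}")
```

For example, `infer_local_threads("local")` returns 1, preserving the old behavior, while `infer_local_threads("local[*]")` scales with the host's core count.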
Author: Aaron Davidson <aaron@databricks.com>
Closes #182 from aarondav/110 and squashes the following commits:
a88294c [Aaron Davidson] Rebased changes for new spark-shell
a9f393e [Aaron Davidson] SPARK-1099: Introduce local[*] mode to infer number of cores
Diffstat (limited to 'bin')
-rwxr-xr-x  bin/spark-shell  4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/bin/spark-shell b/bin/spark-shell
index 535ee3ccd8..ea12d256b2 100755
--- a/bin/spark-shell
+++ b/bin/spark-shell
@@ -34,7 +34,7 @@ set -o posix
 FWDIR="$(cd `dirname $0`/..; pwd)"
 
 SPARK_REPL_OPTS="${SPARK_REPL_OPTS:-""}"
-DEFAULT_MASTER="local"
+DEFAULT_MASTER="local[*]"
 MASTER=${MASTER:-""}
 
 info_log=0
@@ -64,7 +64,7 @@ ${txtbld}OPTIONS${txtrst}:
                         is followed by m for megabytes or g for gigabytes, e.g. "1g".
     -dm --driver-memory : The memory used by the Spark Shell, the number is followed
                         by m for megabytes or g for gigabytes, e.g. "1g".
-    -m --master : A full string that describes the Spark Master, defaults to "local"
+    -m --master : A full string that describes the Spark Master, defaults to "local[*]"
                         e.g. "spark://localhost:7077".
     --log-conf : Enables logging of the supplied SparkConf as INFO at start of the
                         Spark Context.
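The changed `DEFAULT_MASTER` line works together with a standard `${VAR:-default}` fallback. A minimal sketch of that pattern, assuming (as the diff suggests) that the script later substitutes the default when `MASTER` was not supplied by the user:

```shell
# Sketch of the default-master fallback from bin/spark-shell:
# honor a user-supplied MASTER environment variable, otherwise
# fall back to local[*] so first-time users get all cores.
DEFAULT_MASTER="local[*]"
MASTER=${MASTER:-""}

if [ -z "$MASTER" ]; then
  MASTER="$DEFAULT_MASTER"
fi
echo "$MASTER"
```

Keeping `DEFAULT_MASTER` as a single variable means the `-m`/`--master` flag and the environment variable both override it, while unconfigured shells pick up the new `local[*]` behavior.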