From 4819baa658a6c8a3e4c5c504af284ea6091e4c35 Mon Sep 17 00:00:00 2001
From: Matei Zaharia
Date: Sat, 31 Aug 2013 14:21:10 -0700
Subject: More updates, describing changes to recommended use of environment
 vars and new Python stuff

---
 conf/spark-env.sh.template | 23 ++++++++++-------------
 1 file changed, 10 insertions(+), 13 deletions(-)

(limited to 'conf')

diff --git a/conf/spark-env.sh.template b/conf/spark-env.sh.template
index c978db00d9..a367d59d64 100755
--- a/conf/spark-env.sh.template
+++ b/conf/spark-env.sh.template
@@ -1,24 +1,21 @@
 #!/usr/bin/env bash

 # This file contains environment variables required to run Spark. Copy it as
-# spark-env.sh and edit that to configure Spark for your site. At a minimum,
-# the following two variables should be set:
-# - SCALA_HOME, to point to your Scala installation, or SCALA_LIBRARY_PATH to
-#   point to the directory for Scala library JARs (if you install Scala as a
-#   Debian or RPM package, these are in a separate path, often /usr/share/java)
+# spark-env.sh and edit that to configure Spark for your site.
+#
+# The following variables can be set in this file:
+# - SPARK_LOCAL_IP, to override the IP address Spark binds to
 # - MESOS_NATIVE_LIBRARY, to point to your libmesos.so if you use Mesos
+# - SPARK_JAVA_OPTS, to set node-specific JVM options for Spark. Note that
+#   we recommend setting app-wide options in the application's driver program.
+#     Examples of node-specific options: -Dspark.local.dir, GC options
+#     Examples of app-wide options: -Dspark.serializer
 #
-# If using the standalone deploy mode, you can also set variables for it:
+# If using the standalone deploy mode, you can also set variables for it here:
 # - SPARK_MASTER_IP, to bind the master to a different IP address
 # - SPARK_MASTER_PORT / SPARK_MASTER_WEBUI_PORT, to use non-default ports
 # - SPARK_WORKER_CORES, to set the number of cores to use on this machine
 # - SPARK_WORKER_MEMORY, to set how much memory to use (e.g. 1000m, 2g)
 # - SPARK_WORKER_PORT / SPARK_WORKER_WEBUI_PORT
-# - SPARK_WORKER_INSTANCES, to set the number of worker instances/processes
-#   to be spawned on every slave machine
-# - SPARK_JAVA_OPTS, to set the jvm options for executor backend. Note: This is
-#   only for node-specific options, whereas app-specific options should be set
-#   in the application.
-# Examples of node-speicic options : -Dspark.local.dir, GC related options.
-# Examples of app-specific options : -Dspark.serializer
+# - SPARK_WORKER_INSTANCES, to set the number of worker processes per node

--
cgit v1.2.3
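
For reference, a minimal sketch of what a filled-in spark-env.sh might look like under the scheme this patch describes. Only the variable names come from the template above; every value is an illustrative assumption for a hypothetical site, not a recommendation.

    #!/usr/bin/env bash
    # Hypothetical spark-env.sh; all values below are examples, not advice.

    # Override the IP address Spark binds to on this node.
    export SPARK_LOCAL_IP=192.168.1.10

    # Uncomment if running on Mesos (path is an assumption).
    # export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so

    # Node-specific JVM options only (scratch directory, GC tuning);
    # app-wide options such as -Dspark.serializer belong in the driver.
    export SPARK_JAVA_OPTS="-Dspark.local.dir=/mnt/spark -XX:+UseConcMarkSweepGC"

    # Standalone deploy mode: two worker processes on this node, each
    # offering 4 cores and 8 GB of memory.
    export SPARK_WORKER_CORES=4
    export SPARK_WORKER_MEMORY=8g
    export SPARK_WORKER_INSTANCES=2

Per the note the patch adds, an app-wide setting like spark.serializer would instead go in the application's driver program, e.g. by setting the corresponding system property (System.setProperty) before the SparkContext is created, as the Spark configuration docs of this era describe.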