author     Matei Zaharia <matei.zaharia@gmail.com>  2013-08-31 17:49:45 -0700
committer  Matei Zaharia <matei.zaharia@gmail.com>  2013-08-31 17:49:45 -0700
commit     2b29a1d43f5982333d57498427c83155f531fa93 (patch)
tree       71b11b1b809d70ff236e0f49de652b637b4c2d33 /conf
parent     6edef9c833b90f9ac3ad2f677f201ab3ca39a016 (diff)
parent     7862c4a3c8b900db81f5c2af157bd4564d814bd9 (diff)
Merge pull request #877 from mateiz/docs
Doc improvements for 0.8
Diffstat (limited to 'conf')
-rwxr-xr-x  conf/spark-env.sh.template | 25
1 file changed, 11 insertions(+), 14 deletions(-)
diff --git a/conf/spark-env.sh.template b/conf/spark-env.sh.template
index c978db00d9..0a35ee7c79 100755
--- a/conf/spark-env.sh.template
+++ b/conf/spark-env.sh.template
@@ -1,24 +1,21 @@
#!/usr/bin/env bash
# This file contains environment variables required to run Spark. Copy it as
-# spark-env.sh and edit that to configure Spark for your site. At a minimum,
-# the following two variables should be set:
-# - SCALA_HOME, to point to your Scala installation, or SCALA_LIBRARY_PATH to
-# point to the directory for Scala library JARs (if you install Scala as a
-# Debian or RPM package, these are in a separate path, often /usr/share/java)
+# spark-env.sh and edit that to configure Spark for your site.
+#
+# The following variables can be set in this file:
+# - SPARK_LOCAL_IP, to set the IP address Spark binds to on this node
# - MESOS_NATIVE_LIBRARY, to point to your libmesos.so if you use Mesos
+# - SPARK_JAVA_OPTS, to set node-specific JVM options for Spark. Note that
+# we recommend setting app-wide options in the application's driver program.
+# Examples of node-specific options : -Dspark.local.dir, GC options
+# Examples of app-wide options : -Dspark.serializer
#
-# If using the standalone deploy mode, you can also set variables for it:
-# - SPARK_MASTER_IP, to bind the master to a different IP address
+# If using the standalone deploy mode, you can also set variables for it here:
+# - SPARK_MASTER_IP, to bind the master to a different IP address or hostname
# - SPARK_MASTER_PORT / SPARK_MASTER_WEBUI_PORT, to use non-default ports
# - SPARK_WORKER_CORES, to set the number of cores to use on this machine
# - SPARK_WORKER_MEMORY, to set how much memory to use (e.g. 1000m, 2g)
# - SPARK_WORKER_PORT / SPARK_WORKER_WEBUI_PORT
-# - SPARK_WORKER_INSTANCES, to set the number of worker instances/processes
-# to be spawned on every slave machine
-# - SPARK_JAVA_OPTS, to set the jvm options for executor backend. Note: This is
-# only for node-specific options, whereas app-specific options should be set
-# in the application.
-# Examples of node-speicic options : -Dspark.local.dir, GC related options.
-# Examples of app-specific options : -Dspark.serializer
+# - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
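Based on the variables documented in the updated template, a filled-in spark-env.sh for a standalone-mode worker might look like the sketch below. All hostnames, paths, and sizing values here are hypothetical placeholders; adapt them to your own cluster.

```shell
#!/usr/bin/env bash
# Hypothetical example values -- adjust for your own cluster.

# Bind Spark to this node's internal address (assumed value).
export SPARK_LOCAL_IP=10.0.0.5

# Standalone master location (hypothetical hostname; 7077 is the
# conventional master port).
export SPARK_MASTER_IP=spark-master.example.com
export SPARK_MASTER_PORT=7077

# Resources this worker offers to applications.
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=2g

# Node-specific JVM options only; per the template's advice,
# app-wide options such as -Dspark.serializer belong in the
# application's driver program instead.
export SPARK_JAVA_OPTS="-Dspark.local.dir=/mnt/spark-tmp"
```

Because spark-env.sh is sourced by Spark's launch scripts, plain `export` lines like these are all that is needed; no extra invocation logic belongs in the file.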