#!/usr/bin/env bash

# This file contains environment variables required to run Spark. Copy it as
# spark-env.sh and edit that to configure Spark for your site. At a minimum,
# the following two variables should be set:
# - SCALA_HOME, to point to your Scala installation, or SCALA_LIBRARY_PATH to
#   point to the directory for Scala library JARs (if you install Scala as a
#   Debian or RPM package, these are in a separate path, often /usr/share/java)
# - MESOS_NATIVE_LIBRARY, to point to your libmesos.so if you use Mesos
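#
# For example, with Scala unpacked under /usr/local/scala and libmesos.so
# installed system-wide (illustrative paths; adjust for your site), you might
# uncomment and edit lines like:
#   export SCALA_HOME=/usr/local/scala
#   export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so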
#
# If using the standalone deploy mode, you can also set variables for it:
# - SPARK_MASTER_IP, to bind the master to a different IP address
# - SPARK_MASTER_PORT / SPARK_MASTER_WEBUI_PORT, to use non-default ports
# - SPARK_WORKER_CORES, to set the number of cores to use on this machine
# - SPARK_WORKER_MEMORY, to set how much memory to use (e.g. 1000m, 2g)
# - SPARK_WORKER_PORT / SPARK_WORKER_WEBUI_PORT
# - SPARK_WORKER_INSTANCES, to set the number of worker instances/processes
#   to be spawned on every slave machine
# - SPARK_JAVA_OPTS, to set the JVM options for the executor backend. Note: this
#   is only for node-specific options, whereas app-specific options should be
#   set in the application.
#   Examples of node-specific options: -Dspark.local.dir, GC-related options.
#   Examples of app-specific options: -Dspark.serializer
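#
# For example, to run two workers per machine, each with 2 cores and 2 GB of
# memory, and to point Spark's scratch space at /mnt/spark (illustrative
# values; size these for your hardware), spark-env.sh might contain:
#   export SPARK_WORKER_INSTANCES=2
#   export SPARK_WORKER_CORES=2
#   export SPARK_WORKER_MEMORY=2g
#   export SPARK_JAVA_OPTS="-Dspark.local.dir=/mnt/spark"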