author     Matei Zaharia <matei@eecs.berkeley.edu>    2010-07-19 18:00:30 -0700
committer  Matei Zaharia <matei@eecs.berkeley.edu>    2010-07-19 18:00:30 -0700
commit     0435de9e8710ffd2d24a65ef4371529e79d3bf3c (patch)
tree       d8edca9a808d8615dadb110b3c10aa7fd1ee1344 /README
parent     edad598684236d6271ce7853a8312081d15a28a6 (diff)
Made it possible to set various Spark options and environment variables
in general through a conf/spark-env.sh script.
Diffstat (limited to 'README')
-rw-r--r--  README  23
1 file changed, 22 insertions, 1 deletion
diff --git a/README b/README
index e0c7603632..6af34a2294 100644
--- a/README
+++ b/README
@@ -1,4 +1,6 @@
-Spark requires Scala 2.8. This version has been tested with 2.8.0RC3.
+BUILDING
+
+Spark requires Scala 2.8. This version has been tested with 2.8.0.final.
To build and run Spark, you will need to have Scala's bin in your $PATH,
or you will need to set the SCALA_HOME environment variable to point
@@ -13,3 +15,22 @@ example programs prints usage help if no params are given.
Tip: If you are building Spark and examples repeatedly, export USE_FSC=1
to have the Makefile use the fsc compiler daemon instead of scalac.
+
+CONFIGURATION
+
+Spark can be configured through two files: conf/java-opts and conf/spark-env.sh.
+
+In java-opts, you can add flags to be passed to the JVM when running Spark.
+
+In spark-env.sh, you can set any environment variables you wish to be available
+when running Spark programs, such as PATH, SCALA_HOME, etc. There are also
+several Spark-specific variables you can set:
+- SPARK_CLASSPATH: Extra entries to be added to the classpath, separated by ":".
+- SPARK_MEM: Memory for Spark to use, in the format used by java's -Xmx option
+ (for example, 200m means 200 MB, 1g means 1 GB, etc.).
+- SPARK_LIBRARY_PATH: Extra entries to add to java.library.path for locating
+ shared libraries.
+- SPARK_JAVA_OPTS: Extra options to pass to the JVM.
+
+Note that spark-env.sh must be a shell script (it must be executable and start
+with a #! header to specify the shell to use).
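
For illustration, the build notes in the first hunk above might translate into a
shell session like the sketch below. The Scala install path and the bare "make"
invocation are assumptions made for the example, not taken from the commit.

  # Hypothetical build session (placeholder paths, assumed default make target).
  export SCALA_HOME=/opt/scala-2.8.0.final   # point SCALA_HOME at a Scala 2.8 install
  export PATH=$SCALA_HOME/bin:$PATH          # or put Scala's bin on the PATH directly
  export USE_FSC=1                           # reuse the fsc compiler daemon on rebuilds
  make                                       # assumed top-level Makefile invocation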
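
A minimal conf/spark-env.sh along the lines of the new CONFIGURATION section could
look like the sketch below; every path and value shown is a placeholder chosen for
the example rather than something specified by the commit.

  #!/usr/bin/env bash
  # Example conf/spark-env.sh (hypothetical values); must be executable, e.g. chmod +x.
  export SCALA_HOME=/opt/scala-2.8.0.final        # placeholder Scala install location
  export PATH=$SCALA_HOME/bin:$PATH
  export SPARK_MEM=1g                             # heap size, in java -Xmx format
  export SPARK_CLASSPATH=/opt/extra/jars/foo.jar  # extra classpath entries, ":"-separated
  export SPARK_LIBRARY_PATH=/opt/native/lib       # added to java.library.path
  export SPARK_JAVA_OPTS="-verbose:gc"            # extra options passed to the JVM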