From 0435de9e8710ffd2d24a65ef4371529e79d3bf3c Mon Sep 17 00:00:00 2001
From: Matei Zaharia
Date: Mon, 19 Jul 2010 18:00:30 -0700
Subject: Made it possible to set various Spark options and environment
 variables in general through a conf/spark-env.sh script.

---
 README | 23 ++++++++++++++++++++++-
 1 file changed, 22 insertions(+), 1 deletion(-)

diff --git a/README b/README
index e0c7603632..6af34a2294 100644
--- a/README
+++ b/README
@@ -1,4 +1,6 @@
-Spark requires Scala 2.8. This version has been tested with 2.8.0RC3.
+BUILDING
+
+Spark requires Scala 2.8. This version has been tested with 2.8.0.final.
 
 To build and run Spark, you will need to have Scala's bin in your $PATH,
 or you will need to set the SCALA_HOME environment variable to point
@@ -13,3 +15,22 @@ example programs prints usage help if no params are given.
 
 Tip: If you are building Spark and examples repeatedly, export USE_FSC=1
 to have the Makefile use the fsc compiler daemon instead of scalac.
+
+CONFIGURATION
+
+Spark can be configured through two files: conf/java-opts and conf/spark-env.sh.
+
+In java-opts, you can add flags to be passed to the JVM when running Spark.
+
+In spark-env.sh, you can set any environment variables you wish to be available
+when running Spark programs, such as PATH, SCALA_HOME, etc. There are also
+several Spark-specific variables you can set:
+- SPARK_CLASSPATH: Extra entries to be added to the classpath, separated by ":".
+- SPARK_MEM: Memory for Spark to use, in the format used by java's -Xmx option
+  (for example, 200m means 200 MB, 1g means 1 GB, etc.).
+- SPARK_LIBRARY_PATH: Extra entries to add to java.library.path for locating
+  shared libraries.
+- SPARK_JAVA_OPTS: Extra options to pass to the JVM.
+
+Note that spark-env.sh must be a shell script (it must be executable and start
+with a #! header to specify the shell to use).
-- 
cgit v1.2.3
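
As a usage sketch (not part of the patch itself): a minimal conf/spark-env.sh
consistent with the variables described above could look like the following.
The paths and values are illustrative assumptions, not defaults shipped with
Spark.

    #!/bin/sh
    # Example spark-env.sh, run before Spark starts.
    # All paths below are assumptions; point them at your own installation.
    export SCALA_HOME=/opt/scala-2.8.0.final     # where Scala is installed
    export SPARK_MEM=1g                          # passed to java as -Xmx1g
    export SPARK_CLASSPATH=/opt/myapp/myapp.jar:/opt/myapp/lib/util.jar
    export SPARK_LIBRARY_PATH=/opt/myapp/native  # added to java.library.path
    export SPARK_JAVA_OPTS="-verbose:gc"         # extra flags for the JVM

Remember to make the script executable (chmod +x conf/spark-env.sh), since,
as the README notes, it must start with a #! header and be runnable as a
shell script.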