Diffstat (limited to 'README')
-rw-r--r-- | README | 23 |
1 file changed, 22 insertions, 1 deletion
@@ -1,4 +1,6 @@
-Spark requires Scala 2.8. This version has been tested with 2.8.0RC3.
+BUILDING
+
+Spark requires Scala 2.8. This version has been tested with 2.8.0.final.
 
 To build and run Spark, you will need to have Scala's bin in your $PATH,
 or you will need to set the SCALA_HOME environment variable to point
@@ -13,3 +15,22 @@ example programs prints usage help if no params are given.
 
 Tip: If you are building Spark and examples repeatedly, export USE_FSC=1
 to have the Makefile use the fsc compiler daemon instead of scalac.
+
+CONFIGURATION
+
+Spark can be configured through two files: conf/java-opts and conf/spark-env.sh.
+
+In java-opts, you can add flags to be passed to the JVM when running Spark.
+
+In spark-env.sh, you can set any environment variables you wish to be available
+when running Spark programs, such as PATH, SCALA_HOME, etc. There are also
+several Spark-specific variables you can set:
+- SPARK_CLASSPATH: Extra entries to be added to the classpath, separated by ":".
+- SPARK_MEM: Memory for Spark to use, in the format used by java's -Xmx option
+  (for example, 200m means 200 MB, 1g means 1 GB, etc).
+- SPARK_LIBRARY_PATH: Extra entries to add to java.library.path for locating
+  shared libraries.
+- SPARK_JAVA_OPTS: Extra options to pass to the JVM.
+
+Note that spark-env.sh must be a shell script (it must be executable and start
+with a #! header to specify the shell to use).
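To illustrate the new CONFIGURATION section, a minimal conf/spark-env.sh using the documented variables might look like the sketch below; the Scala install path, jar path, and chosen values are hypothetical, not part of the diff.

```shell
#!/bin/sh
# Hypothetical example conf/spark-env.sh; all paths and values below
# are placeholders chosen for illustration.

# Put Scala on the PATH (or set SCALA_HOME, as the README describes).
export SCALA_HOME=/opt/scala-2.8.0.final
export PATH="$SCALA_HOME/bin:$PATH"

# Spark-specific variables from the README:
export SPARK_MEM=2g                            # java -Xmx format: 2g = 2 GB
export SPARK_CLASSPATH=/opt/extra/my-lib.jar   # extra classpath entries, ":"-separated
export SPARK_JAVA_OPTS="-verbose:gc"           # extra options passed to the JVM
```

Remember that the file must be executable (chmod +x conf/spark-env.sh) and begin with a #! line, since Spark runs it as a shell script.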