author     Matei Zaharia <matei@eecs.berkeley.edu>   2011-02-02 19:21:49 -0800
committer  Matei Zaharia <matei@eecs.berkeley.edu>   2011-02-02 19:21:49 -0800
commit     c1c766a93c0b5530ae42d722c3e3cbe4f4029ef0 (patch)
tree       340fe181bda9311a79e5d388429c2a43e98d2715 /README
parent     50df43bf7b1faf2b19d715eccb85c4a9d90b35a4 (diff)
Updated readme
Diffstat (limited to 'README')
-rw-r--r--  README  15
1 file changed, 7 insertions(+), 8 deletions(-)
diff --git a/README b/README
index d60b143085..a75830a9d5 100644
--- a/README
+++ b/README
@@ -1,13 +1,14 @@
 BUILDING
 
-Spark requires Scala 2.8. This version has been tested with 2.8.0.final.
+Spark requires Scala 2.8. This version has been tested with 2.8.1.final.
 
-To build and run Spark, you will need to have Scala's bin in your $PATH,
-or you will need to set the SCALA_HOME environment variable to point
-to where you've installed Scala. Scala must be accessible through one
-of these methods on Mesos slave nodes as well as on the master.
+The project is built using Simple Build Tool (SBT), which is packaged with it.
+To build Spark and its example programs, run sbt/sbt compile.
 
-To build Spark and the example programs, run make.
+To run Spark, you will need to have Scala's bin in your $PATH, or you
+will need to set the SCALA_HOME environment variable to point to where
+you've installed Scala. Scala must be accessible through one of these
+methods on Mesos slave nodes as well as on the master.
 
 To run one of the examples, use ./run <class> <params>. For example,
 ./run spark.examples.SparkLR will run the Logistic Regression example.
@@ -17,8 +18,6 @@ All of the Spark samples take a <host> parameter that is the Mesos master
 to connect to. This can be a Mesos URL, or "local" to run locally with one
 thread, or "local[N]" to run locally with N threads.
 
-Tip: If you are building Spark and examples repeatedly, export USE_FSC=1
-to have the Makefile use the fsc compiler daemon instead of scalac.
 
 CONFIGURATION
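
Note: pulling together the commands mentioned in the updated README text above,
the basic build-and-run workflow looks roughly like this. This is only a sketch
assembled from the diff: the SCALA_HOME path is a placeholder, and "local" is
just one of the <host> values the samples accept.

    # Build Spark and its example programs using the bundled SBT
    sbt/sbt compile

    # Only needed if scala is not already on your $PATH
    # (the install path below is a placeholder)
    export SCALA_HOME=/path/to/scala-2.8.1

    # Run the Logistic Regression example; "local" runs with one thread,
    # and could instead be a Mesos URL or "local[N]" for N threads
    ./run spark.examples.SparkLR local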