author | Patrick Wendell <pwendell@gmail.com> | 2014-01-03 17:32:25 -0800
committer | Patrick Wendell <pwendell@gmail.com> | 2014-01-03 18:30:17 -0800
commit | 9e6f3bdcda1ab48159afa4f54b64d05e42a8688e (patch)
tree | dbe72e606726f95c725bce3caed8a32a15d74b5c /README.md
parent | bc311bb826b5548b9c4c55320711f3b18dc19397 (diff)
Changes on top of Prashant's patch.
Closes #316
Diffstat (limited to 'README.md')
-rw-r--r-- | README.md | 19
1 file changed, 3 insertions(+), 16 deletions(-)
@@ -13,7 +13,7 @@ This README file only contains basic setup instructions.
 ## Building
 
 Spark requires Scala 2.10. The project is built using Simple Build Tool (SBT),
-which can be obtained from [here](http://www.scala-sbt.org). To build Spark and its example programs, run:
+which can be obtained [here](http://www.scala-sbt.org). To build Spark and its example programs, run:
 
     sbt assembly
 
@@ -38,24 +38,11 @@ locally with one thread, or "local[N]" to run locally with N threads.
 
 ## Running tests
 
-### With sbt (Much faster to run compared to maven)
-Once you have built spark with `sbt assembly` mentioned in [Building](#Building) section. Test suits can be run as follows using sbt.
+Testing first requires [Building](#Building) Spark. Once Spark is built, tests
+can be run using:
 
 `sbt test`
 
-### With maven.
-1. Export these necessary environment variables as follows.
-
- - `export SCALA_HOME=<scala distribution>`
- - `export MAVEN_OPTS="-Xmx1512m -XX:MaxPermSize=512m"`
-
-2. Build assembly by
-`mvn package -DskipTests`
-
-3. Run tests
-`mvn test`
-
 ## A Note About Hadoop Versions
 
 Spark uses the Hadoop core library to talk to HDFS and other Hadoop-supported
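After this patch, the README's build-and-test instructions reduce to a two-step sbt workflow. A minimal sketch of that flow, assuming `sbt` is on the `PATH`; the single-suite invocation at the end uses sbt's standard `test-only` task (sbt 0.12/0.13 era) with an illustrative suite name, neither of which appears in the patch itself:

```shell
# Step 1: build Spark and its example programs (the Building section).
sbt assembly

# Step 2: with the assembly built, run the full test suites.
sbt test

# Optionally, run a single suite; the suite name here is hypothetical.
sbt "test-only org.apache.spark.rdd.RDDSuite"
```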