| author | Holden Karau <holden@pigscanfly.ca> | 2014-01-04 20:17:30 -0800 |
|---|---|---|
| committer | Holden Karau <holden@pigscanfly.ca> | 2014-01-04 20:17:30 -0800 |
| commit | b4a1ffc6c2634118bb1d07216221b862c32d6397 | |
| tree | efecbc7b1c8f62ca2c857aef54e30c1923bdbac4 /README.md | |
| parent | 97123be1d7d1b68ec0cda09fd5894fc4af5f82c5 | |
Switch from sbt to ./sbt in the README file
Diffstat (limited to 'README.md')

| -rw-r--r-- | README.md | 4 |
|---|---|---|

1 file changed, 2 insertions(+), 2 deletions(-)
```diff
@@ -15,7 +15,7 @@ This README file only contains basic setup instructions.
 Spark requires Scala 2.10. The project is built using Simple Build Tool (SBT),
 which can be obtained [here](http://www.scala-sbt.org). To build Spark and its
 example programs, run:

-    sbt assembly
+    ./sbt assembly

 Once you've built Spark, the easiest way to start using it is the shell:
@@ -41,7 +41,7 @@ locally with one thread, or "local[N]" to run locally with N threads.
 Testing first requires [Building](#Building) Spark. Once Spark is built, tests
 can be run using:

-`sbt test`
+`./sbt test`

 ## A Note About Hadoop Versions
```
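The intent of the change can be sketched as a small shell snippet. The `SBT` variable and the fallback logic below are illustrative assumptions, not part of the commit; the commit itself only edits the README text to invoke the repo-local launcher.

```shell
#!/bin/sh
# Illustrative sketch: the updated README invokes the launcher script
# shipped in the Spark checkout (./sbt) instead of assuming a
# system-wide `sbt` is on the PATH.
if [ -x ./sbt ]; then
  SBT=./sbt   # repo-local launcher, as the updated README recommends
else
  SBT=sbt     # hypothetical fallback to a system-wide install
fi
echo "build: $SBT assembly"
echo "tests: $SBT test"
```

Using `./sbt` makes the build instructions self-contained: contributors do not need sbt pre-installed, since the checkout provides its own launcher.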