-rw-r--r--  .gitignore  2
-rw-r--r--  README.md   4
-rwxr-xr-x  sbt/sbt     2
3 files changed, 4 insertions, 4 deletions
diff --git a/.gitignore b/.gitignore
index 1692bde20f..39635d7eef 100644
--- a/.gitignore
+++ b/.gitignore
@@ -4,7 +4,7 @@
*.iml
*.iws
.idea/
-.sbtlib/*.jar
+sbt/*.jar
.settings
.cache
/build/
diff --git a/README.md b/README.md
index db1e2c4c0a..2c08a4ac63 100644
--- a/README.md
+++ b/README.md
@@ -15,7 +15,7 @@ This README file only contains basic setup instructions.
Spark requires Scala 2.10. The project is built using Simple Build Tool (SBT),
which can be obtained [here](http://www.scala-sbt.org). To build Spark and its example programs, run:
- ./sbt assembly
+ ./sbt/sbt assembly
Once you've built Spark, the easiest way to start using it is the shell:
@@ -41,7 +41,7 @@ locally with one thread, or "local[N]" to run locally with N threads.
Testing first requires [Building](#Building) Spark. Once Spark is built, tests
can be run using:
-`./sbt test`
+`./sbt/sbt test`
## A Note About Hadoop Versions
diff --git a/sbt/sbt b/sbt/sbt
index d21806ed83..a7146e3b05 100755
--- a/sbt/sbt
+++ b/sbt/sbt
@@ -5,7 +5,7 @@
SBT_VERSION=0.12.4
URL1=http://typesafe.artifactoryonline.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/${SBT_VERSION}/sbt-launch.jar
URL2=http://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/sbt-launch/${SBT_VERSION}/sbt-launch.jar
-JAR=.sbtlib/sbt-launch-${SBT_VERSION}.jar
+JAR=sbt/sbt-launch-${SBT_VERSION}.jar
printf "Checking for system sbt ["
if hash sbt 2>/dev/null; then
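The hunk above ends right after the check for a system-wide sbt; the rest of the launcher is not part of this diff. For context, a minimal sketch of how such a launcher typically continues, assuming a wget/curl download fallback and a plain java -jar hand-off (the actual script in the repository may differ). Only SBT_VERSION, URL1, URL2, and JAR come from the diff above; everything else is illustrative.

# Sketch only: assumed continuation of sbt/sbt after the "hash sbt" check.
# The download-and-exec logic below is not the repository's verbatim code.
if [ ! -f "${JAR}" ]; then
  # Fetch the launcher jar into the renamed sbt/ directory (formerly .sbtlib/)
  for url in "${URL1}" "${URL2}"; do
    if command -v wget >/dev/null 2>&1; then
      wget --quiet --output-document="${JAR}" "${url}" && break
    elif command -v curl >/dev/null 2>&1; then
      curl --silent --location --output "${JAR}" "${url}" && break
    fi
  done
fi

# Pass all arguments through to the sbt launcher
java -jar "${JAR}" "$@"

With the .sbtlib/ to sbt/ rename in place, the build and test commands documented in the README become ./sbt/sbt assembly and ./sbt/sbt test, run from the repository root.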