author    Matei Zaharia <matei@eecs.berkeley.edu>  2012-10-12 14:40:07 -0700
committer Matei Zaharia <matei@eecs.berkeley.edu>  2012-10-12 14:40:07 -0700
commit    1183b309419bb3efb17ef77cf491597fb0849706 (patch)
tree      abd774558832db9b5b0baef5a7e17e81e83c1e0f /docs/scala-programming-guide.md
parent    603b419fdfc710bb1a78565e22cba1f890039b5c (diff)
parent    23015ccac045dd0e2c95c8625ee354984a8d594c (diff)
Merge branch 'dev' of github.com:mesos/spark into dev
Diffstat (limited to 'docs/scala-programming-guide.md')
-rw-r--r--  docs/scala-programming-guide.md  8
1 file changed, 7 insertions(+), 1 deletion(-)
diff --git a/docs/scala-programming-guide.md b/docs/scala-programming-guide.md
index 8c084528d7..73f8b123be 100644
--- a/docs/scala-programming-guide.md
+++ b/docs/scala-programming-guide.md
@@ -17,7 +17,13 @@ This guide shows each of these features and walks through some samples. It assum
# Linking with Spark
-To write a Spark application, you will need to add both Spark and its dependencies to your CLASSPATH. The easiest way to do this is to run `sbt/sbt assembly` to build both Spark and its dependencies into one JAR (`core/target/spark-core-assembly-0.6.0.jar`), then add this to your CLASSPATH. Alternatively, you can publish Spark to the Maven cache on your machine using `sbt/sbt publish-local`. It will be an artifact called `spark-core` under the organization `org.spark-project`.
+To write a Spark application, you will need to add both Spark and its dependencies to your CLASSPATH. If you use sbt or Maven, Spark is available through Maven Central at:
+
+    groupId = org.spark-project
+    artifactId = spark-core_{{site.SCALA_VERSION}}
+    version = {{site.SPARK_VERSION}}
+
+For other build systems or environments, you can run `sbt/sbt assembly` to build both Spark and its dependencies into one JAR (`core/target/spark-core-assembly-0.6.0.jar`), then add this to your CLASSPATH.
In addition, you'll need to import some Spark classes and implicit conversions. Add the following lines at the top of your program:
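
For readers wiring this up in sbt, the coordinates added above translate into a one-line dependency. A minimal sketch of a build.sbt, assuming the concrete versions current at the time of this commit (Scala 2.9.2, Spark 0.6.0) in place of the `{{site.*}}` template variables:

    // build.sbt -- minimal sketch; the 2.9.2/0.6.0 versions are assumptions
    // standing in for {{site.SCALA_VERSION}} and {{site.SPARK_VERSION}}.
    scalaVersion := "2.9.2"

    // %% appends the Scala version to the artifactId, producing
    // spark-core_2.9.2, i.e. spark-core_{{site.SCALA_VERSION}} above.
    libraryDependencies += "org.spark-project" %% "spark-core" % "0.6.0"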
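The hunk ends before showing the import lines that sentence refers to. For context only: in Spark 0.6, before the move to the org.apache.spark namespace, the guide's imports came from the top-level `spark` package. A sketch of what those lines likely were:

    // Assumed pre-0.7 package layout; not part of this diff.
    import spark.SparkContext
    import SparkContext._  // pulls in Spark's implicit conversions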