author    Patrick Wendell <pwendell@gmail.com>  2012-10-09 22:37:58 -0700
committer Patrick Wendell <pwendell@gmail.com>  2012-10-09 22:39:29 -0700
commit    0f760a0bd372e4d1c69b0cbd03c81348330fa609
tree      8135c80e402bd4559841e5d14dfaec47b8090039 /docs/scala-programming-guide.md
parent    4de5cc1ad43cc50b8610913f60916899a7fd75ad
Updating programming guide with new link instructions
Diffstat (limited to 'docs/scala-programming-guide.md')

 docs/scala-programming-guide.md | 8 +++++++-
 1 file changed, 7 insertions(+), 1 deletion(-)
diff --git a/docs/scala-programming-guide.md b/docs/scala-programming-guide.md
index 57a2c04b16..49225fbec8 100644
--- a/docs/scala-programming-guide.md
+++ b/docs/scala-programming-guide.md
@@ -17,7 +17,13 @@ This guide shows each of these features and walks through some samples. It assum
 
 # Linking with Spark
 
-To write a Spark application, you will need to add both Spark and its dependencies to your CLASSPATH. The easiest way to do this is to run `sbt/sbt assembly` to build both Spark and its dependencies into one JAR (`core/target/spark-core-assembly-0.6.0.jar`), then add this to your CLASSPATH. Alternatively, you can publish Spark to the Maven cache on your machine using `sbt/sbt publish-local`. It will be an artifact called `spark-core` under the organization `org.spark-project`.
+To write a Spark application, you will need to add both Spark and its dependencies to your CLASSPATH. If you use sbt or Maven, Spark is available through Maven Central at:
+
+    groupId = org.spark_project
+    artifactId = spark-core_{{site.SCALA_VERSION}}
+    version = {{site.SPARK_VERSION}}
+
+For other build systems or environments, you can run `sbt/sbt assembly` to build both Spark and its dependencies into one JAR (`core/target/spark-core-assembly-0.6.0.jar`), then add this to your CLASSPATH.
 
 In addition, you'll need to import some Spark classes and implicit conversions. Add the following lines at the top of your program:
 
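The Maven coordinates added in this hunk translate into a one-line sbt dependency. A minimal sketch, with two assumptions: the version `0.6.0` is inferred from the `spark-core-assembly-0.6.0.jar` name in the removed line (the guide itself uses the templated `{{site.SPARK_VERSION}}`), and `%%` is used so sbt appends the project's Scala binary version, matching the `spark-core_{{site.SCALA_VERSION}}` artifactId:

```scala
// build.sbt sketch — not part of this commit.
// Version "0.6.0" is an assumption inferred from the assembly JAR name
// in the same diff; substitute the release you are actually using.
// `%%` makes sbt append the Scala binary version to the artifact name,
// yielding spark-core_<scalaVersion> as in the documented artifactId.
libraryDependencies += "org.spark_project" %% "spark-core" % "0.6.0"
```

Maven users would express the same coordinates as `<groupId>org.spark_project</groupId>`, `<artifactId>spark-core_...</artifactId>`, and `<version>...</version>` elements in their POM, spelling out the Scala suffix by hand since Maven has no `%%` equivalent.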