author     Patrick Wendell <pwendell@gmail.com>   2013-09-01 09:38:56 -0700
committer  Patrick Wendell <pwendell@gmail.com>   2013-09-01 09:43:42 -0700
commit     0e375a3cc280880c430da00838b7231b942f96d4 (patch)
tree       d2aaddc57a05e6d1962df07d832d45ed5d67f5d5 /docs/quick-start.md
parent     6371febe189a6960401afd82985a793bf6c96a97 (diff)
Add assembly plugin links
Diffstat (limited to 'docs/quick-start.md')
-rw-r--r--  docs/quick-start.md  15
1 file changed, 9 insertions(+), 6 deletions(-)
diff --git a/docs/quick-start.md b/docs/quick-start.md
index 8cf4156f13..4507b21c5e 100644
--- a/docs/quick-start.md
+++ b/docs/quick-start.md
@@ -294,12 +294,15 @@ There are a few additional considerations when running jobs on a
### Including Your Dependencies
If your code depends on other projects, you will need to ensure they are also
-present on the slave nodes. The most common way to do this is to create an
-assembly jar (or "uber" jar) containing your code and its dependencies. You
-may then submit the assembly jar when creating a SparkContext object. If you
-do this, you should make Spark itself a `provided` dependency, since it will
-already be present on the slave nodes. It is also possible to submit your
-dependent jars one-by-one when creating a SparkContext.
+present on the slave nodes. A popular approach is to create an
+assembly jar (or "uber" jar) containing your code and its dependencies. Both
+[sbt](https://github.com/sbt/sbt-assembly) and
+[Maven](http://maven.apache.org/plugins/maven-assembly-plugin/)
+have assembly plugins. When creating assembly jars, list Spark
+itself as a `provided` dependency; it need not be bundled since it is
+already present on the slaves. Once you have an assembled jar,
+add it to the SparkContext as shown here. It is also possible to submit
+your dependent jars one-by-one when creating a SparkContext.
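
For context, a minimal sketch of a build definition following the pattern described in the added text above. The plugin version, group ID, artifact name, and Spark version are illustrative placeholders, not part of this commit:

```scala
// project/plugins.sbt -- registers the sbt-assembly plugin (version is a placeholder)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.1")

// build.sbt -- Spark itself is marked "provided" so it is not bundled into the assembly jar
import AssemblyKeys._

assemblySettings

name := "simple-job"

scalaVersion := "2.9.3"

// group ID, artifact name, and version are placeholders for whatever Spark build the cluster runs
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.8.0-incubating" % "provided"
```

Running `sbt assembly` then produces a single jar that can be handed to Spark, for example via the jar list accepted by the SparkContext constructor in the Scala API of that era.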
### Setting Configuration Options
Spark includes several configuration options which influence the behavior
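
As a rough illustration of what setting such options looked like at the time (the property name, values, and package name below are assumptions for illustration, not drawn from this commit), options were commonly supplied as Java system properties before the SparkContext is created:

```scala
// Package name assumed for the 0.8 line of Spark.
import org.apache.spark.SparkContext

// Hypothetical sketch: configuration supplied as system properties before the context exists.
// "spark.executor.memory", the master URL, and the app name are illustrative values only.
System.setProperty("spark.executor.memory", "1g")

val sc = new SparkContext("spark://master-host:7077", "Simple Job")
```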