author     Patrick Wendell <pwendell@gmail.com>  2012-10-09 22:38:19 -0700
committer  Patrick Wendell <pwendell@gmail.com>  2012-10-09 22:39:28 -0700
commit     8321e7f0c2d95f7b382293a4208dbf8cd2fe7809 (patch)
tree       213e69dc8d797331d3038fca10f6257b64eeecba /docs
parent     5013c785fdb99dddb1e2344de39097d4645e1ff5 (diff)
Fixing YARN instructions
Diffstat (limited to 'docs')
-rw-r--r--  docs/running-on-yarn.md | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/docs/running-on-yarn.md b/docs/running-on-yarn.md
index dd094ab131..6fb81b6004 100644
--- a/docs/running-on-yarn.md
+++ b/docs/running-on-yarn.md
@@ -19,7 +19,7 @@ branch of Spark, called `yarn`, which you can do as follows:
- In order to distribute Spark within the cluster, it must be packaged into a single JAR file. This can be done by running `sbt/sbt assembly`
- Your application code must be packaged into a separate JAR file.
-If you want to test out the YARN deployment mode, you can use the current Spark examples. A `spark-examples_{{site.SCALA_VERSION}}-{{site.SPARK_VERSION}}-SNAPSHOT.jar` file can be generated by running `sbt/sbt package`. NOTE: since the documentation you're reading is for Spark version {{site.SPARK_VERSION}}, we are assuming here that you have downloaded Spark {{site.SPARK_VERSION}} or checked it out of source control. If you are using a different version of Spark, the version numbers in the jar generated by the sbt package command will obviously be different.
+If you want to test out the YARN deployment mode, you can use the current Spark examples. A `spark-examples_{{site.SCALA_VERSION}}-{{site.SPARK_VERSION}}` file can be generated by running `sbt/sbt package`. NOTE: since the documentation you're reading is for Spark version {{site.SPARK_VERSION}}, we are assuming here that you have downloaded Spark {{site.SPARK_VERSION}} or checked it out of source control. If you are using a different version of Spark, the version numbers in the jar generated by the sbt package command will obviously be different.
# Launching Spark on YARN
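
As a quick reference, the packaging steps described in the hunk above boil down to two sbt invocations run from the Spark source root (a sketch only, assuming the `yarn` branch checkout described earlier in the doc):

    # Package Spark into a single assembly JAR so it can be distributed on the cluster
    sbt/sbt assembly
    # Package the bundled examples into their own JAR
    # (named spark-examples_<scala version>-<spark version>.jar)
    sbt/sbt package
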
@@ -35,8 +35,8 @@ The command to launch the YARN Client is as follows:
For example:
- SPARK_JAR=./core/target/spark-core-assembly-{{site.SPARK_VERSION}}-SNAPSHOT.jar ./run spark.deploy.yarn.Client \
- --jar examples/target/scala-{{site.SCALA_VERSION}}/spark-examples_{{site.SCALA_VERSION}}-{{site.SPARK_VERSION}}-SNAPSHOT.jar \
+ SPARK_JAR=./core/target/spark-core-assembly-{{site.SPARK_VERSION}}.jar ./run spark.deploy.yarn.Client \
+ --jar examples/target/scala-{{site.SCALA_VERSION}}/spark-examples_{{site.SCALA_VERSION}}-{{site.SPARK_VERSION}}.jar \
--class spark.examples.SparkPi \
--args standalone \
--num-workers 3 \
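
Reading the two hunks together, the example launch command after this fix looks roughly like the sketch below. The version numbers 2.9.2 and 0.6.0 are illustrative placeholders for the `{{site.SCALA_VERSION}}` and `{{site.SPARK_VERSION}}` template variables, not values taken from this diff, and the excerpt above is truncated, so the full docs pass further flags after `--num-workers`:

    # Assumes the assembly and examples JARs were built with sbt as noted above;
    # substitute the Scala/Spark versions of your actual checkout.
    SPARK_JAR=./core/target/spark-core-assembly-0.6.0.jar ./run spark.deploy.yarn.Client \
      --jar examples/target/scala-2.9.2/spark-examples_2.9.2-0.6.0.jar \
      --class spark.examples.SparkPi \
      --args standalone \
      --num-workers 3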