author | Matei Zaharia <matei@eecs.berkeley.edu> | 2012-10-09 14:30:23 -0700
committer | Matei Zaharia <matei@eecs.berkeley.edu> | 2012-10-09 14:30:23 -0700
commit | bc0bc672d02e8f5f12cd1e14863db36c42acff96 (patch)
tree | 826f2673c093d3a982cfe6f96242725ff0a2089f /docs/scala-programming-guide.md
parent | ad28aebb0adfe3710bfcf741fbc9105282ee67a8 (diff)
Updates to documentation:
- Edited quick start and tuning guide to simplify them a little
- Simplified top menu bar
- Made a SparkContext constructor parameter private that had been left
  public
- Various small fixes
Diffstat (limited to 'docs/scala-programming-guide.md')
-rw-r--r-- | docs/scala-programming-guide.md | 8
1 file changed, 6 insertions(+), 2 deletions(-)
diff --git a/docs/scala-programming-guide.md b/docs/scala-programming-guide.md
index 76a1957efa..57a2c04b16 100644
--- a/docs/scala-programming-guide.md
+++ b/docs/scala-programming-guide.md
@@ -1,6 +1,6 @@
 ---
 layout: global
-title: Spark Scala Programming Guide
+title: Scala Programming Guide
 ---
 
 * This will become a table of contents (this text will be scraped).
@@ -37,7 +37,11 @@ new SparkContext(master, jobName, [sparkHome], [jars])
 
 The `master` parameter is a string specifying a [Mesos](running-on-mesos.html) cluster to connect to, or a special "local" string to run in local mode, as described below. `jobName` is a name for your job, which will be shown in the Mesos web UI when running on a cluster. Finally, the last two parameters are needed to deploy your code to a cluster if running in distributed mode, as described later.
 
-In the Spark interpreter, a special interpreter-aware SparkContext is already created for you, in the variable called `sc`. Making your own SparkContext will not work. You can set which master the context connects to using the `MASTER` environment variable. For example, run `MASTER=local[4] ./spark-shell` to run locally with four cores.
+In the Spark shell, a special interpreter-aware SparkContext is already created for you, in the variable called `sc`. Making your own SparkContext will not work. You can set which master the context connects to using the `MASTER` environment variable. For example, to run on four cores, use
+
+{% highlight bash %}
+$ MASTER=local[4] ./spark-shell
+{% endhighlight %}
 
 ### Master URLs
 
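The constructor documented in the patched section above can be sketched in a standalone program. This is a hypothetical example against the 2012-era (pre-1.0) API, where `SparkContext` lived in the `spark` package; the job name "My Job" and the object name are illustrative, and `sparkHome`/`jars` are omitted since they are only needed when deploying to a cluster.

```scala
// Sketch of constructing a SparkContext directly (outside the shell),
// assuming the pre-1.0 Spark API where the class is spark.SparkContext.
import spark.SparkContext

object SparkContextExample {
  def main(args: Array[String]) {
    // "local[4]" runs Spark in local mode with four worker threads,
    // mirroring MASTER=local[4] ./spark-shell from the patch above.
    val sc = new SparkContext("local[4]", "My Job")

    // A trivial job to show the context is usable.
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println(sum)

    sc.stop()
  }
}
```

Inside `spark-shell` this construction is unnecessary (and will not work), since the shell's own `sc` is already wired to the chosen master.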