author    Prashant Sharma <prashant.s@imaginea.com>  2014-01-02 18:50:12 +0530
committer Prashant Sharma <prashant.s@imaginea.com>  2014-01-02 18:50:12 +0530
commit a3f90a2ecf14a01aa27fc95c133b1ff375583adb (patch)
tree   f6a67730b423122b1172682f659ad3344193e6bf /docs/quick-start.md
parent 94b7a7fe37a4b1459bfdbece2a4162451d6a8ac2 (diff)
pyspark -> bin/pyspark
Diffstat (limited to 'docs/quick-start.md')
-rw-r--r--  docs/quick-start.md | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/quick-start.md b/docs/quick-start.md
index 912f332236..2fa2bd718b 100644
--- a/docs/quick-start.md
+++ b/docs/quick-start.md
@@ -277,11 +277,11 @@ We can pass Python functions to Spark, which are automatically serialized along
For applications that use custom classes or third-party libraries, we can add those code dependencies to SparkContext to ensure that they will be available on remote machines; this is described in more detail in the [Python programming guide](python-programming-guide.html).
`SimpleApp` is simple enough that we do not need to specify any code dependencies.
-We can run this application using the `pyspark` script:
+We can run this application using the `bin/pyspark` script:
{% highlight python %}
$ cd $SPARK_HOME
-$ ./pyspark SimpleApp.py
+$ ./bin/pyspark SimpleApp.py
...
Lines with a: 46, Lines with b: 23
{% endhighlight python %}
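For context, the `SimpleApp.py` that this patched snippet runs is the quick-start example that counts lines containing the letters `a` and `b`. The sketch below mirrors that counting logic in plain Python, without a Spark dependency; the helper name and sample data are hypothetical and only illustrate what the RDD `filter`/`count` pipeline computes.

```python
def count_lines_containing(lines, ch):
    """Count lines that contain the given character.

    This mirrors the Spark pipeline
    logData.filter(lambda s: ch in s).count()
    from the quick-start SimpleApp example, but runs on a plain list.
    """
    return sum(1 for line in lines if ch in line)


# Hypothetical sample data standing in for the file SimpleApp reads.
sample = ["alpha", "beta", "gamma"]

num_as = count_lines_containing(sample, "a")  # all three lines contain 'a'
num_bs = count_lines_containing(sample, "b")  # only "beta" contains 'b'

print("Lines with a: %i, Lines with b: %i" % (num_as, num_bs))
# → Lines with a: 3, Lines with b: 1
```

Running the real example still requires a Spark installation, which is what the `./bin/pyspark SimpleApp.py` invocation above provides.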