author    Patrick Wendell <pwendell@gmail.com>  2013-08-11 20:33:58 -0700
committer Patrick Wendell <pwendell@gmail.com>  2013-08-11 20:33:58 -0700
commit    92445241463abf2cc764aab2f79d6f0b54b8b42e (patch)
tree      84bf4420f99a79c90a0a10722208c6ef84211fdd /docs
parent    e5b9ed2833911cb894cf7ad05299aa1385a7e600 (diff)
Removing dead docs
Diffstat (limited to 'docs')
-rw-r--r--  docs/spark-simple-tutorial.md  |  41
1 file changed, 0 insertions(+), 41 deletions(-)
diff --git a/docs/spark-simple-tutorial.md b/docs/spark-simple-tutorial.md
deleted file mode 100644
index fbdbc7d19d..0000000000
--- a/docs/spark-simple-tutorial.md
+++ /dev/null
@@ -1,41 +0,0 @@
----
-layout: global
-title: Tutorial - Running a Simple Spark Application
----
-
-1. Create a directory for the Spark demo:
-
- ~$ mkdir SparkTest
-
-2. Copy the sbt files from the ~/spark/sbt directory:
-
- ~/SparkTest$ cp -r ../spark/sbt .
-
-3. Edit the ~/SparkTest/sbt/sbt file to look like this:
-
- #!/usr/bin/env bash
- java -Xmx800M -XX:MaxPermSize=150m -jar $(dirname $0)/sbt-launch-*.jar "$@"
-
-4. To build a Spark application, you need Spark and its dependencies in a single Java archive (JAR) file. Create this JAR in Spark's main directory with sbt as:
-
- ~/spark$ sbt/sbt assembly
-
-5. Create a source file in the ~/SparkTest/src/main/scala directory:
-
- ~/SparkTest/src/main/scala$ vi Test1.scala
-
-6. Make the contents of the Test1.scala file look like this:
-
- import spark.SparkContext
- import spark.SparkContext._
- object Test1 {
- def main(args: Array[String]) {
- val sc = new SparkContext("local", "SparkTest")
- println(sc.parallelize(1 to 10).reduce(_ + _))
- System.exit(0)
- }
- }
-
-7. Run the Test1.scala file:
-
- ~/SparkTest$ sbt/sbt run
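
For reference, the deleted Test1.scala only summed the integers 1 through 10 via Spark's `parallelize`/`reduce`. The same arithmetic can be checked in plain Scala without a SparkContext (the object name `SumCheck` here is illustrative, not from the tutorial):

```scala
// Plain-Scala sketch of the computation in the removed Test1.scala:
// the Spark version parallelized the range, but the reduction is the same.
object SumCheck {
  def main(args: Array[String]): Unit = {
    val total = (1 to 10).reduce(_ + _) // sum of 1..10
    println(total)
  }
}
```

Running it prints 55, matching what the tutorial's `sc.parallelize(1 to 10).reduce(_ + _)` produced on a local master.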