| author | Reynold Xin <rxin@cs.berkeley.edu> | 2013-08-11 20:35:09 -0700 |
|---|---|---|
| committer | Reynold Xin <rxin@cs.berkeley.edu> | 2013-08-11 20:35:09 -0700 |
| commit | 2a39d2ca25491a44016227a9851b7f0c8d783244 (patch) | |
| tree | 84bf4420f99a79c90a0a10722208c6ef84211fdd | |
| parent | e5b9ed2833911cb894cf7ad05299aa1385a7e600 (diff) | |
| parent | 92445241463abf2cc764aab2f79d6f0b54b8b42e (diff) | |
Merge pull request #810 from pwendell/dead_doc_code
Remove now dead code inside of docs
-rw-r--r-- | docs/spark-simple-tutorial.md | 41 |
1 file changed, 0 insertions, 41 deletions
diff --git a/docs/spark-simple-tutorial.md b/docs/spark-simple-tutorial.md
deleted file mode 100644
index fbdbc7d19d..0000000000
--- a/docs/spark-simple-tutorial.md
+++ /dev/null
@@ -1,41 +0,0 @@
----
-layout: global
-title: Tutorial - Running a Simple Spark Application
----
-
-1. Create directory for spark demo:
-
-       ~$ mkdir SparkTest
-
-2. Copy the sbt files in ~/spark/sbt directory:
-
-       ~/SparkTest$ cp -r ../spark/sbt .
-
-3. Edit the ~/SparkTest/sbt/sbt file to look like this:
-
-       #!/usr/bin/env bash
-       java -Xmx800M -XX:MaxPermSize=150m -jar $(dirname $0)/sbt-launch-*.jar "$@"
-
-4. To build a Spark application, you need Spark and its dependencies in a single Java archive (JAR) file. Create this JAR in Spark's main directory with sbt as:
-
-       ~/spark$ sbt/sbt assembly
-
-5. create a source file in ~/SparkTest/src/main/scala directory:
-
-       ~/SparkTest/src/main/scala$ vi Test1.scala
-
-6. Make the contain of the Test1.scala file like this:
-
-       import spark.SparkContext
-       import spark.SparkContext._
-       object Test1 {
-         def main(args: Array[String]) {
-           val sc = new SparkContext("local", "SparkTest")
-           println(sc.parallelize(1 to 10).reduce(_ + _))
-           System.exit(0)
-         }
-       }
-
-7. Run the Test1.scala file:
-
-       ~/SparkTest$ sbt/sbt run
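The `Test1` program in the deleted tutorial sums the integers 1 through 10 with `reduce`. The same associative fold can be sketched in plain Scala without a `SparkContext`, which shows what the Spark version computes; `Test1Local` is a hypothetical name, and Spark's `sc.parallelize(...).reduce(_ + _)` would distribute this same fold across partitions:

```scala
// Minimal local sketch of the deleted Test1 example (no Spark required).
// In the tutorial, sc.parallelize(1 to 10).reduce(_ + _) performs this
// same fold, but distributed over the RDD's partitions.
object Test1Local {
  def main(args: Array[String]): Unit = {
    val sum = (1 to 10).reduce(_ + _) // 1 + 2 + ... + 10
    println(sum) // prints 55
  }
}
```

Because `_ + _` is associative, Spark can reduce each partition independently and then combine the partial sums, which is why the local and distributed versions agree.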