author     Patrick Wendell <pwendell@apache.org>  2014-05-30 08:55:36 +0000
committer  Patrick Wendell <pwendell@apache.org>  2014-05-30 08:55:36 +0000
commit     66fb4b11cbae79d9044b2bbf2e53351642a58ff5 (patch)
tree       2a3238ae425816d2296c2972987e56b3e949ea0b /site/docs/1.0.0/index.html
parent     3936fd5d223549bfa06bd6eeba6becfc88daee52 (diff)
Docs for Spark 1.0.0
Diffstat (limited to 'site/docs/1.0.0/index.html')
-rw-r--r--  site/docs/1.0.0/index.html  | 283
1 file changed, 283 insertions(+), 0 deletions(-)
diff --git a/site/docs/1.0.0/index.html b/site/docs/1.0.0/index.html
new file mode 100644
index 000000000..00d836871
--- /dev/null
+++ b/site/docs/1.0.0/index.html
@@ -0,0 +1,283 @@
+<!DOCTYPE html>
+<!--[if lt IE 7]> <html class="no-js lt-ie9 lt-ie8 lt-ie7"> <![endif]-->
+<!--[if IE 7]> <html class="no-js lt-ie9 lt-ie8"> <![endif]-->
+<!--[if IE 8]> <html class="no-js lt-ie9"> <![endif]-->
+<!--[if gt IE 8]><!--> <html class="no-js"> <!--<![endif]-->
+ <head>
+ <meta charset="utf-8">
+ <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
+ <title>Spark Overview - Spark 1.0.0 Documentation</title>
+ <meta name="description" content="">
+
+
+
+ <link rel="stylesheet" href="css/bootstrap.min.css">
+ <style>
+ body {
+ padding-top: 60px;
+ padding-bottom: 40px;
+ }
+ </style>
+ <meta name="viewport" content="width=device-width">
+ <link rel="stylesheet" href="css/bootstrap-responsive.min.css">
+ <link rel="stylesheet" href="css/main.css">
+
+ <script src="js/vendor/modernizr-2.6.1-respond-1.1.0.min.js"></script>
+
+ <link rel="stylesheet" href="css/pygments-default.css">
+
+
+
+ </head>
+ <body>
+ <!--[if lt IE 7]>
+ <p class="chromeframe">You are using an outdated browser. <a href="http://browsehappy.com/">Upgrade your browser today</a> or <a href="http://www.google.com/chromeframe/?redirect=true">install Google Chrome Frame</a> to better experience this site.</p>
+ <![endif]-->
+
+ <!-- This code is taken from http://twitter.github.com/bootstrap/examples/hero.html -->
+
+ <div class="navbar navbar-fixed-top" id="topbar">
+ <div class="navbar-inner">
+ <div class="container">
+ <div class="brand"><a href="index.html">
+ <img src="img/spark-logo-hd.png" style="height:50px;"/></a><span class="version">1.0.0</span>
+ </div>
+ <ul class="nav">
+          <!--TODO(andyk): Add class="active" attribute to li somehow.-->
+ <li><a href="index.html">Overview</a></li>
+
+ <li class="dropdown">
+ <a href="#" class="dropdown-toggle" data-toggle="dropdown">Programming Guides<b class="caret"></b></a>
+ <ul class="dropdown-menu">
+ <li><a href="quick-start.html">Quick Start</a></li>
+ <li><a href="programming-guide.html">Spark Programming Guide</a></li>
+ <li class="divider"></li>
+ <li><a href="streaming-programming-guide.html">Spark Streaming</a></li>
+ <li><a href="sql-programming-guide.html">Spark SQL</a></li>
+ <li><a href="mllib-guide.html">MLlib (Machine Learning)</a></li>
+ <li><a href="graphx-programming-guide.html">GraphX (Graph Processing)</a></li>
+ <li><a href="bagel-programming-guide.html">Bagel (Pregel on Spark)</a></li>
+ </ul>
+ </li>
+
+ <li class="dropdown">
+ <a href="#" class="dropdown-toggle" data-toggle="dropdown">API Docs<b class="caret"></b></a>
+ <ul class="dropdown-menu">
+ <li><a href="api/scala/index.html#org.apache.spark.package">Scaladoc</a></li>
+ <li><a href="api/java/index.html">Javadoc</a></li>
+ <li><a href="api/python/index.html">Python API</a></li>
+ </ul>
+ </li>
+
+ <li class="dropdown">
+ <a href="#" class="dropdown-toggle" data-toggle="dropdown">Deploying<b class="caret"></b></a>
+ <ul class="dropdown-menu">
+ <li><a href="cluster-overview.html">Overview</a></li>
+ <li><a href="submitting-applications.html">Submitting Applications</a></li>
+ <li class="divider"></li>
+ <li><a href="ec2-scripts.html">Amazon EC2</a></li>
+ <li><a href="spark-standalone.html">Standalone Mode</a></li>
+ <li><a href="running-on-mesos.html">Mesos</a></li>
+ <li><a href="running-on-yarn.html">YARN</a></li>
+ </ul>
+ </li>
+
+ <li class="dropdown">
+ <a href="api.html" class="dropdown-toggle" data-toggle="dropdown">More<b class="caret"></b></a>
+ <ul class="dropdown-menu">
+ <li><a href="configuration.html">Configuration</a></li>
+ <li><a href="monitoring.html">Monitoring</a></li>
+ <li><a href="tuning.html">Tuning Guide</a></li>
+ <li><a href="job-scheduling.html">Job Scheduling</a></li>
+ <li><a href="security.html">Security</a></li>
+ <li><a href="hardware-provisioning.html">Hardware Provisioning</a></li>
+ <li><a href="hadoop-third-party-distributions.html">3<sup>rd</sup>-Party Hadoop Distros</a></li>
+ <li class="divider"></li>
+ <li><a href="building-with-maven.html">Building Spark with Maven</a></li>
+ <li><a href="https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark">Contributing to Spark</a></li>
+ </ul>
+ </li>
+ </ul>
+ <!--<p class="navbar-text pull-right"><span class="version-text">v1.0.0</span></p>-->
+ </div>
+ </div>
+ </div>
+
+ <div class="container" id="content">
+
+ <h1 class="title">Spark Overview</h1>
+
+
+ <p>Apache Spark is a fast and general-purpose cluster computing system.
+It provides high-level APIs in Java, Scala and Python,
+and an optimized engine that supports general execution graphs.
+It also supports a rich set of higher-level tools including <a href="http://shark.cs.berkeley.edu">Shark</a> (Hive on Spark), <a href="sql-programming-guide.html">Spark SQL</a> for structured data, <a href="mllib-guide.html">MLlib</a> for machine learning, <a href="graphx-programming-guide.html">GraphX</a> for graph processing, and <a href="streaming-programming-guide.html">Spark Streaming</a>.</p>
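+
+<p>As a quick illustration of the Scala API, the following snippet counts the lines of a local file that contain the word &#8220;Spark&#8221;, assuming a <code>SparkContext</code> named <code>sc</code> (for example, the one created by <code>bin/spark-shell</code>) and a local <code>README.md</code> file:</p>
+
+<pre><code>// Load a text file and count the lines that mention "Spark".
+val lines = sc.textFile("README.md")
+val count = lines.filter(line =&gt; line.contains("Spark")).count()
+println("Lines with Spark: " + count)
+</code></pre>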
+
+<h1 id="downloading">Downloading</h1>
+
+<p>Get Spark from the <a href="http://spark.apache.org/downloads.html">downloads page</a> of the project website. This documentation is for Spark version 1.0.0. The downloads page
+contains Spark packages for many popular HDFS versions. If you&#8217;d like to build Spark from
+scratch, visit <a href="building-with-maven.html">building Spark with Maven</a>.</p>
+
+<p>Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS). It&#8217;s easy to run
+locally on one machine &#8212; all you need is to have <code>java</code> installed on your system <code>PATH</code>,
+or the <code>JAVA_HOME</code> environment variable pointing to a Java installation.</p>
+
+<p>Spark runs on Java 6+ and Python 2.6+. For the Scala API, Spark 1.0.0 uses
+Scala 2.10. You will need to use a compatible Scala version
+(2.10.x).</p>
+
+<h1 id="running-the-examples-and-shell">Running the Examples and Shell</h1>
+
+<p>Spark comes with several sample programs. Scala, Java and Python examples are in the
+<code>examples/src/main</code> directory. To run one of the Java or Scala sample programs, use
+<code>bin/run-example &lt;class&gt; [params]</code> in the top-level Spark directory. (Behind the scenes, this
+invokes the more general
+<a href="submitting-applications.html"><code>spark-submit</code> script</a> for
+launching applications.) For example,</p>
+
+<pre><code>./bin/run-example SparkPi 10
+</code></pre>
+
+<p>You can also run Spark interactively through a modified version of the Scala shell. This is a
+great way to learn the framework.</p>
+
+<pre><code>./bin/spark-shell --master local[2]
+</code></pre>
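+
+<p>Once the shell starts, a <code>SparkContext</code> is available as <code>sc</code>; a short, illustrative session might look like:</p>
+
+<pre><code>scala&gt; val data = sc.parallelize(1 to 1000)
+scala&gt; data.filter(_ % 2 == 0).count()   // returns 500
+</code></pre>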
+
+<p>The <code>--master</code> option specifies the
+<a href="submitting-applications.html#master-urls">master URL for a distributed cluster</a>, or <code>local</code> to run
+locally with one thread, or <code>local[N]</code> to run locally with N threads. You should start by using
+<code>local</code> for testing. For a full list of options, run Spark shell with the <code>--help</code> option.</p>
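+
+<p>In a standalone application, the same setting can be supplied programmatically through <code>SparkConf</code>; a minimal sketch in Scala (the application name here is arbitrary) would be:</p>
+
+<pre><code>import org.apache.spark.{SparkConf, SparkContext}
+
+// Run locally with 2 threads; any master URL from the list above works here.
+val conf = new SparkConf().setAppName("MyApp").setMaster("local[2]")
+val sc = new SparkContext(conf)
+</code></pre>
+
+<p>In practice, applications launched through <code>spark-submit</code> usually leave the master out of the code and pass it with the <code>--master</code> flag instead.</p>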
+
+<p>Spark also provides a Python API. To run Spark interactively in a Python interpreter, use
+<code>bin/pyspark</code>:</p>
+
+<pre><code>./bin/pyspark --master local[2]
+</code></pre>
+
+<p>Example applications are also provided in Python. For example,</p>
+
+<pre><code>./bin/spark-submit examples/src/main/python/pi.py 10
+</code></pre>
+
+<h1 id="launching-on-a-cluster">Launching on a Cluster</h1>
+
+<p>The Spark <a href="cluster-overview.html">cluster mode overview</a> explains the key concepts in running on a cluster.
+Spark can run by itself, or on top of several existing cluster managers. It currently provides several
+options for deployment:</p>
+
+<ul>
+ <li><a href="ec2-scripts.html">Amazon EC2</a>: our EC2 scripts let you launch a cluster in about 5 minutes</li>
+ <li><a href="spark-standalone.html">Standalone Deploy Mode</a>: simplest way to deploy Spark on a private cluster</li>
+ <li><a href="running-on-mesos.html">Apache Mesos</a></li>
+ <li><a href="running-on-yarn.html">Hadoop YARN</a></li>
+</ul>
+
+<h1 id="where-to-go-from-here">Where to Go from Here</h1>
+
+<p><strong>Programming Guides:</strong></p>
+
+<ul>
+ <li><a href="quick-start.html">Quick Start</a>: a quick introduction to the Spark API; start here!</li>
+ <li><a href="programming-guide.html">Spark Programming Guide</a>: detailed overview of Spark
+in all supported languages (Scala, Java, Python)</li>
+ <li>Modules built on Spark:
+ <ul>
+ <li><a href="streaming-programming-guide.html">Spark Streaming</a>: processing real-time data streams</li>
+ <li><a href="sql-programming-guide.html">Spark SQL</a>: support for structured data and relational queries</li>
+ <li><a href="mllib-guide.html">MLlib</a>: built-in machine learning library</li>
+ <li><a href="graphx-programming-guide.html">GraphX</a>: Spark&#8217;s new API for graph processing</li>
+ <li><a href="bagel-programming-guide.html">Bagel (Pregel on Spark)</a>: older, simple graph processing model</li>
+ </ul>
+ </li>
+</ul>
+
+<p><strong>API Docs:</strong></p>
+
+<ul>
+ <li><a href="api/scala/index.html#org.apache.spark.package">Spark Scala API (Scaladoc)</a></li>
+ <li><a href="api/java/index.html">Spark Java API (Javadoc)</a></li>
+ <li><a href="api/python/index.html">Spark Python API (Epydoc)</a></li>
+</ul>
+
+<p><strong>Deployment Guides:</strong></p>
+
+<ul>
+ <li><a href="cluster-overview.html">Cluster Overview</a>: overview of concepts and components when running on a cluster</li>
+ <li><a href="submitting-applications.html">Submitting Applications</a>: packaging and deploying applications</li>
+ <li>Deployment modes:
+ <ul>
+ <li><a href="ec2-scripts.html">Amazon EC2</a>: scripts that let you launch a cluster on EC2 in about 5 minutes</li>
+ <li><a href="spark-standalone.html">Standalone Deploy Mode</a>: launch a standalone cluster quickly without a third-party cluster manager</li>
+ <li><a href="running-on-mesos.html">Mesos</a>: deploy a private cluster using
+ <a href="http://mesos.apache.org">Apache Mesos</a></li>
+ <li><a href="running-on-yarn.html">YARN</a>: deploy Spark on top of Hadoop NextGen (YARN)</li>
+ </ul>
+ </li>
+</ul>
+
+<p><strong>Other Documents:</strong></p>
+
+<ul>
+ <li><a href="configuration.html">Configuration</a>: customize Spark via its configuration system</li>
+ <li><a href="monitoring.html">Monitoring</a>: track the behavior of your applications</li>
+ <li><a href="tuning.html">Tuning Guide</a>: best practices to optimize performance and memory use</li>
+ <li><a href="job-scheduling.html">Job Scheduling</a>: scheduling resources across and within Spark applications</li>
+ <li><a href="security.html">Security</a>: Spark security support</li>
+ <li><a href="hardware-provisioning.html">Hardware Provisioning</a>: recommendations for cluster hardware</li>
+ <li><a href="hadoop-third-party-distributions.html">3<sup>rd</sup> Party Hadoop Distributions</a>: using common Hadoop distributions</li>
+ <li><a href="building-with-maven.html">Building Spark with Maven</a>: build Spark using the Maven system</li>
+ <li><a href="https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark">Contributing to Spark</a></li>
+</ul>
+
+<p><strong>External Resources:</strong></p>
+
+<ul>
+ <li><a href="http://spark.apache.org">Spark Homepage</a></li>
+ <li><a href="http://shark.cs.berkeley.edu">Shark</a>: Apache Hive over Spark</li>
+ <li><a href="http://spark.apache.org/mailing-lists.html">Mailing Lists</a>: ask questions about Spark here</li>
+ <li><a href="http://ampcamp.berkeley.edu/">AMP Camps</a>: a series of training camps at UC Berkeley that featured talks and
+exercises about Spark, Shark, Spark Streaming, Mesos, and more. <a href="http://ampcamp.berkeley.edu/3/">Videos</a>,
+<a href="http://ampcamp.berkeley.edu/3/">slides</a> and <a href="http://ampcamp.berkeley.edu/3/exercises/">exercises</a> are
+available online for free.</li>
+ <li><a href="http://spark.apache.org/examples.html">Code Examples</a>: more are also available in the <code>examples</code> subfolder of Spark (<a href="https://github.com/apache/spark/tree/master/examples/src/main/scala/org/apache/spark/examples">Scala</a>,
+ <a href="https://github.com/apache/spark/tree/master/examples/src/main/java/org/apache/spark/examples">Java</a>,
+ <a href="https://github.com/apache/spark/tree/master/examples/src/main/python">Python</a>)</li>
+</ul>
+
+<h1 id="community">Community</h1>
+
+<p>To get help using Spark or keep up with Spark development, sign up for the <a href="http://spark.apache.org/mailing-lists.html">user mailing list</a>.</p>
+
+<p>If you&#8217;re in the San Francisco Bay Area, there&#8217;s a regular <a href="http://www.meetup.com/spark-users/">Spark meetup</a> every few weeks. Come by to meet the developers and other users.</p>
+
+<p>Finally, if you&#8217;d like to contribute code to Spark, read <a href="contributing-to-spark.html">how to contribute</a>.</p>
+
+
+ </div> <!-- /container -->
+
+ <script src="js/vendor/jquery-1.8.0.min.js"></script>
+ <script src="js/vendor/bootstrap.min.js"></script>
+ <script src="js/main.js"></script>
+
+ <!-- MathJax Section -->
+ <script type="text/x-mathjax-config">
+ MathJax.Hub.Config({
+ TeX: { equationNumbers: { autoNumber: "AMS" } }
+ });
+ </script>
+ <script type="text/javascript"
+ src="http://cdn.mathjax.org/mathjax/latest/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script>
+ <script>
+ MathJax.Hub.Config({
+ tex2jax: {
+ inlineMath: [ ["$", "$"], ["\\\\(","\\\\)"] ],
+ displayMath: [ ["$$","$$"], ["\\[", "\\]"] ],
+ processEscapes: true,
+ skipTags: ['script', 'noscript', 'style', 'textarea', 'pre']
+ }
+ });
+ </script>
+ </body>
+</html>