author     Davies Liu <davies.liu@gmail.com>    2014-09-16 12:51:58 -0700
committer  Josh Rosen <joshrosen@apache.org>    2014-09-16 12:51:58 -0700
commit     ec1adecbb72d291d7ef122fb0505bae53116e0e6 (patch)
tree       a61931ca6e78016fbaae5c4b75c97a35c47fde22 /python/docs/index.rst
parent     a9e910430fb6bb4ef1f6ae20761c43b96bb018df (diff)
[SPARK-3430] [PySpark] [Doc] generate PySpark API docs using Sphinx
Using Sphinx to generate API docs for PySpark.

Requirement: Sphinx

```
$ cd python/docs/
$ make html
```

The generated API docs will be located at python/docs/_build/html/index.html. They can co-exist with the docs generated by Epydoc. This is a first working version; after it is merged in, we can continue to improve it and eventually replace Epydoc.

Author: Davies Liu <davies.liu@gmail.com>

Closes #2292 from davies/sphinx and squashes the following commits:

425a3b1 [Davies Liu] cleanup
1573298 [Davies Liu] move docs to python/docs/
5fe3903 [Davies Liu] Merge branch 'master' into sphinx
9468ab0 [Davies Liu] fix makefile
b408f38 [Davies Liu] address all comments
e2ccb1b [Davies Liu] update name and version
9081ead [Davies Liu] generate PySpark API docs using Sphinx
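The conf.py and Makefile that drive this `make html` build are added elsewhere in the same commit and are not shown in this diff, which is limited to index.rst. As a rough illustration only, a minimal Sphinx autodoc configuration for a project laid out like PySpark typically looks like the sketch below; the paths, project name, and version here are assumptions, not the contents of the committed file.

```
# Hypothetical minimal conf.py sketch -- NOT the file added by this commit.
# It only illustrates the usual ingredients of a Sphinx autodoc setup.
import os
import sys

# Make the pyspark package importable so autodoc can introspect docstrings
# (assumed layout: python/docs/conf.py next to the python/pyspark package).
sys.path.insert(0, os.path.abspath('..'))

project = 'PySpark'
version = '1.1'      # assumed; the commit log only says "update name and version"
release = version

# Pull API documentation straight from the module and class docstrings.
extensions = ['sphinx.ext.autodoc', 'sphinx.ext.viewcode']

# Root document referenced by the toctree below (index.rst).
master_doc = 'index'
```

With a configuration along these lines, `make html` in python/docs/ invokes sphinx-build and writes the rendered pages under _build/html/, matching the output path given in the commit message.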
Diffstat (limited to 'python/docs/index.rst')
-rw-r--r--  python/docs/index.rst  37
1 file changed, 37 insertions(+), 0 deletions(-)
diff --git a/python/docs/index.rst b/python/docs/index.rst
new file mode 100644
index 0000000000..25b3f9bd93
--- /dev/null
+++ b/python/docs/index.rst
@@ -0,0 +1,37 @@
+.. pyspark documentation master file, created by
+ sphinx-quickstart on Thu Aug 28 15:17:47 2014.
+ You can adapt this file completely to your liking, but it should at least
+ contain the root `toctree` directive.
+
+Welcome to PySpark API reference!
+===================================
+
+Contents:
+
+.. toctree::
+ :maxdepth: 2
+
+ pyspark
+ pyspark.sql
+ pyspark.mllib
+
+
+Core classes:
+---------------
+
+ :class:`pyspark.SparkContext`
+
+ Main entry point for Spark functionality.
+
+ :class:`pyspark.RDD`
+
+ A Resilient Distributed Dataset (RDD), the basic abstraction in Spark.
+
+
+Indices and tables
+==================
+
+* :ref:`genindex`
+* :ref:`modindex`
+* :ref:`search`
+
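The new index.rst points readers at pyspark.SparkContext (the main entry point) and pyspark.RDD (the basic abstraction) as the core classes. For context, a minimal snippet exercising both is sketched below; it is not part of this commit, just an illustration of the public API the generated docs describe.

```
# Minimal illustration of the two core classes named in index.rst;
# not part of this commit, only context for readers of the docs.
from pyspark import SparkContext

# SparkContext: the main entry point for Spark functionality.
sc = SparkContext("local[2]", "docs-example")

# RDD: the basic abstraction, created here from a local collection.
rdd = sc.parallelize([1, 2, 3, 4, 5])
squares = rdd.map(lambda x: x * x).collect()
print(squares)  # [1, 4, 9, 16, 25]

sc.stop()
```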