path: root/site/docs
author    Xiangrui Meng <meng@apache.org>    2015-01-15 03:08:01 +0000
committer Xiangrui Meng <meng@apache.org>    2015-01-15 03:08:01 +0000
commit    fdd9397bbe13ff134c1a3da6cf13ed379e92ce2d (patch)
tree      92d87ed52ecbbf5ae4f498723991f3acde532e55 /site/docs
parent    170b0d4200067d9015a72f5e116b9fe6f8fb4cd1 (diff)
remove developers section from spark.ml guide (SPARK-5254)
Diffstat (limited to 'site/docs')
-rw-r--r--  site/docs/1.2.0/ml-guide.html | 15
1 file changed, 0 insertions, 15 deletions
diff --git a/site/docs/1.2.0/ml-guide.html b/site/docs/1.2.0/ml-guide.html
index 654809e93..58fa4d62a 100644
--- a/site/docs/1.2.0/ml-guide.html
+++ b/site/docs/1.2.0/ml-guide.html
@@ -160,7 +160,6 @@ to <code>spark.ml</code>.</p>
</ul>
</li>
<li><a href="#dependencies">Dependencies</a></li>
- <li><a href="#developers">Developers</a></li>
</ul>
<h1 id="main-concepts">Main Concepts</h1>
@@ -838,20 +837,6 @@ Please see the <a href="mllib-guide.html#Dependencies">MLlib Dependencies guide<
<p>Spark ML also depends upon Spark SQL, but the relevant parts of Spark SQL do not bring additional dependencies.</p>
-<h1 id="developers">Developers</h1>
-
-<p><strong>Development plan</strong></p>
-
-<p>If all goes well, <code>spark.ml</code> will become the primary ML package at the time of the Spark 1.3 release. Initially, simple wrappers will be used to port algorithms to <code>spark.ml</code>, but eventually, code will be moved to <code>spark.ml</code> and <code>spark.mllib</code> will be deprecated.</p>
-
-<p><strong>Advice to developers</strong></p>
-
-<p>During the next development cycle, new algorithms should be contributed to <code>spark.mllib</code>, but we welcome patches sent to either package. If an algorithm is best expressed using the new API (e.g., feature transformers), we may ask for developers to use the new <code>spark.ml</code> API.
-Wrappers for old and new algorithms can be contributed to <code>spark.ml</code>.</p>
-
-<p>Users will be able to use algorithms from either of the two packages. The main difficulty will be the differences in APIs between the two packages.</p>
-
-
</div> <!-- /container -->