path: root/docs/ml-guide.md
author     Xiangrui Meng <meng@databricks.com>    2015-01-14 18:54:17 -0800
committer  Xiangrui Meng <meng@databricks.com>    2015-01-14 18:54:17 -0800
commit     6abc45e340d3be5f07236adc104db5f8dda0d514 (patch)
tree       e058b80312d31e615f7295c2af8454e39530c268 /docs/ml-guide.md
parent     cfa397c126c857bfc9843d9e598a14b7c1e0457f (diff)
download   spark-6abc45e340d3be5f07236adc104db5f8dda0d514.tar.gz
           spark-6abc45e340d3be5f07236adc104db5f8dda0d514.tar.bz2
           spark-6abc45e340d3be5f07236adc104db5f8dda0d514.zip
[SPARK-5254][MLLIB] remove developers section from spark.ml guide
Forgot to remove this section in #4052.

Author: Xiangrui Meng <meng@databricks.com>

Closes #4053 from mengxr/SPARK-5254-update and squashes the following commits:

f295bde [Xiangrui Meng] remove developers section from spark.ml guide
Diffstat (limited to 'docs/ml-guide.md')
-rw-r--r--  docs/ml-guide.md | 14
1 file changed, 0 insertions, 14 deletions
diff --git a/docs/ml-guide.md b/docs/ml-guide.md
index 88158fd77e..be178d7689 100644
--- a/docs/ml-guide.md
+++ b/docs/ml-guide.md
@@ -689,17 +689,3 @@ Spark ML currently depends on MLlib and has the same dependencies.
Please see the [MLlib Dependencies guide](mllib-guide.html#Dependencies) for more info.
Spark ML also depends upon Spark SQL, but the relevant parts of Spark SQL do not bring additional dependencies.
-
-# Developers
-
-**Development plan**
-
-If all goes well, `spark.ml` will become the primary ML package at the time of the Spark 1.3 release. Initially, simple wrappers will be used to port algorithms to `spark.ml`, but eventually, code will be moved to `spark.ml` and `spark.mllib` will be deprecated.
-
-**Advice to developers**
-
-During the next development cycle, new algorithms should be contributed to `spark.mllib`, but we welcome patches sent to either package. If an algorithm is best expressed using the new API (e.g., feature transformers), we may ask for developers to use the new `spark.ml` API.
-Wrappers for old and new algorithms can be contributed to `spark.ml`.
-
-Users will be able to use algorithms from either of the two packages. The main difficulty will be the differences in APIs between the two packages.
-
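The removed section notes that the main difficulty for users will be the differences in APIs between `spark.mllib` and `spark.ml` (e.g., feature transformers). As a minimal sketch (not part of this commit or of the guide itself), the contrast looks roughly like this, assuming the Spark 1.2/1.3-era RDD-based `spark.mllib` API and the DataFrame-based `spark.ml` Pipeline API; the helper names `trainWithMllib` and `trainWithSparkMl` are illustrative only:

```scala
// Illustrative sketch only: contrasting the two APIs mentioned in the removed
// "Developers" section, assuming Spark 1.2/1.3-era packages.

import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.rdd.RDD

import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.{LogisticRegression => MLLogisticRegression}
import org.apache.spark.ml.feature.{HashingTF, Tokenizer}
import org.apache.spark.sql.DataFrame

// spark.mllib: RDD-based API that operates directly on LabeledPoint records.
def trainWithMllib(training: RDD[LabeledPoint]) =
  new LogisticRegressionWithLBFGS().setNumClasses(2).run(training)

// spark.ml: DataFrame-based Pipeline API composed of feature transformers
// (Tokenizer, HashingTF) followed by an estimator (LogisticRegression).
def trainWithSparkMl(training: DataFrame) = {
  val tokenizer = new Tokenizer().setInputCol("text").setOutputCol("words")
  val hashingTF = new HashingTF().setInputCol("words").setOutputCol("features")
  val lr = new MLLogisticRegression().setMaxIter(10)
  new Pipeline().setStages(Array(tokenizer, hashingTF, lr)).fit(training)
}
```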