author    Xiangrui Meng <meng@databricks.com>  2015-01-14 18:54:17 -0800
committer Xiangrui Meng <meng@databricks.com>  2015-01-14 18:54:28 -0800
commit    3813547a163056e76eb11c77ccad764356d726a5 (patch)
tree      b40651da76924a96ea5a2425062920755ad11877
parent    47fb0d0ea4c0d4316e5ceb06e03c430a2370713b (diff)
[SPARK-5254][MLLIB] remove developers section from spark.ml guide
Forgot to remove this section in #4052.

Author: Xiangrui Meng <meng@databricks.com>

Closes #4053 from mengxr/SPARK-5254-update and squashes the following commits:

f295bde [Xiangrui Meng] remove developers section from spark.ml guide

(cherry picked from commit 6abc45e340d3be5f07236adc104db5f8dda0d514)
Signed-off-by: Xiangrui Meng <meng@databricks.com>
-rw-r--r--  docs/ml-guide.md  |  14
1 file changed, 0 insertions(+), 14 deletions(-)
diff --git a/docs/ml-guide.md b/docs/ml-guide.md
index 88158fd77e..be178d7689 100644
--- a/docs/ml-guide.md
+++ b/docs/ml-guide.md
@@ -689,17 +689,3 @@ Spark ML currently depends on MLlib and has the same dependencies.
 Please see the [MLlib Dependencies guide](mllib-guide.html#Dependencies) for more info.
 Spark ML also depends upon Spark SQL, but the relevant parts of Spark SQL do not bring additional dependencies.
-
-# Developers
-
-**Development plan**
-
-If all goes well, `spark.ml` will become the primary ML package at the time of the Spark 1.3 release. Initially, simple wrappers will be used to port algorithms to `spark.ml`, but eventually, code will be moved to `spark.ml` and `spark.mllib` will be deprecated.
-
-**Advice to developers**
-
-During the next development cycle, new algorithms should be contributed to `spark.mllib`, but we welcome patches sent to either package. If an algorithm is best expressed using the new API (e.g., feature transformers), we may ask for developers to use the new `spark.ml` API.
-Wrappers for old and new algorithms can be contributed to `spark.ml`.
-
-Users will be able to use algorithms from either of the two packages. The main difficulty will be the differences in APIs between the two packages.
-
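As background for the API difference the removed text alludes to, here is a minimal, hypothetical sketch (not part of the commit) contrasting the RDD-based `spark.mllib` API with the DataFrame-based `spark.ml` Pipeline API, assuming Spark 1.3-era classes such as `LogisticRegressionWithLBFGS` and `LogisticRegression`; exact class names and signatures varied across releases (for example, `spark.ml` operated on `SchemaRDD` in Spark 1.2 before `DataFrame` was introduced).

```scala
// Sketch only: contrasts the two packages' training entry points.

// spark.mllib: RDD-based API, trained directly on RDD[LabeledPoint].
import org.apache.spark.mllib.classification.LogisticRegressionWithLBFGS
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.rdd.RDD

def trainWithMllib(data: RDD[LabeledPoint]) =
  new LogisticRegressionWithLBFGS().run(data)

// spark.ml: DataFrame-based API, where the estimator is a Pipeline stage.
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.sql.DataFrame

def trainWithSparkMl(training: DataFrame) = {
  val lr = new LogisticRegression().setMaxIter(10).setRegParam(0.01)
  new Pipeline().setStages(Array(lr)).fit(training)
}
```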