author    Bryan Cutler <cutlerb@gmail.com>    2017-01-31 15:42:36 -0800
committer Holden Karau <holden@us.ibm.com>    2017-01-31 15:42:36 -0800
commit    57d70d26c88819360cdc806e7124aa2cc1b9e4c5 (patch)
tree      989a46211f9f6e7069dd77a41bf3f805716f863d /project
parent    ce112cec4f9bff222aa256893f94c316662a2a7e (diff)
[SPARK-17161][PYSPARK][ML] Add PySpark-ML JavaWrapper convenience function to create Py4J JavaArrays
## What changes were proposed in this pull request?

This adds a convenience function to the Python `JavaWrapper` so that a Py4J JavaArray can easily be created for class constructors that take a Scala `Array` as input, removing the need for a separate Java/Python-friendly constructor. The function takes a Java class, which Py4J uses to create a Java array of that class. As an example, `OneVsRest` has been updated to use this function and its alternate constructor has been removed.

## How was this patch tested?

Added unit tests for the new convenience function and updated the `OneVsRest` doctests, which use it to persist the model.

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #14725 from BryanCutler/pyspark-new_java_array-CountVectorizer-SPARK-17161.
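The change described above is essentially a thin layer over Py4J's `Gateway.new_array`, which allocates a typed Java array that Python code can fill element by element and then pass to JVM constructors expecting a Scala/Java `Array`. Below is a minimal sketch of that idea, assuming an active `SparkContext`; the helper name and signature here are illustrative and may not match the patch line for line.

```python
from pyspark import SparkContext


def new_java_array(pylist, java_class):
    """Build a Py4J JavaArray of ``java_class`` from a Python list.

    Sketch of the convenience function described in the commit: it lets
    Python code call JVM constructors that expect a Scala/Java Array
    without adding a Python-friendly constructor on the Scala side.
    """
    sc = SparkContext._active_spark_context  # assumes a running SparkContext
    java_array = sc._gateway.new_array(java_class, len(pylist))
    for i, item in enumerate(pylist):
        java_array[i] = item
    return java_array


# Usage sketch: build a java.lang.String[] from a Python list of strings.
# sc = SparkContext.getOrCreate()
# jarr = new_java_array(["a", "b", "c"], sc._jvm.java.lang.String)
```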
Diffstat (limited to 'project')
-rw-r--r--    project/MimaExcludes.scala    5
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/project/MimaExcludes.scala b/project/MimaExcludes.scala
index 7e6e143523..9d359427f2 100644
--- a/project/MimaExcludes.scala
+++ b/project/MimaExcludes.scala
@@ -54,7 +54,10 @@ object MimaExcludes {
// [SPARK-19069] [CORE] Expose task 'status' and 'duration' in spark history server REST API.
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.status.api.v1.TaskData.this"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.status.api.v1.TaskData.<init>$default$10"),
- ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.status.api.v1.TaskData.<init>$default$11")
+ ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.status.api.v1.TaskData.<init>$default$11"),
+
+ // [SPARK-17161] Removing Python-friendly constructors not needed
+ ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ml.classification.OneVsRestModel.this")
)
// Exclude rules for 2.1.x