author    Reynold Xin <rxin@databricks.com>    2016-02-13 21:06:31 -0800
committer Reynold Xin <rxin@databricks.com>    2016-02-13 21:06:31 -0800
commit    354d4c24be892271bd9a9eab6ceedfbc5d671c9c (patch)
tree      c0503ad0c303e6db4882bdbfa356fb78a8dd32fb /project/MimaExcludes.scala
parent    388cd9ea8db2e438ebef9dfb894298f843438c43 (diff)
[SPARK-13296][SQL] Move UserDefinedFunction into sql.expressions.
This pull request has the following changes:

1. Moved UserDefinedFunction into the expressions package. This is more consistent with how we structure the packages for window functions and UDAFs.
2. Moved UserDefinedPythonFunction into the execution.python package, so we don't have a random private class in the top-level sql package.
3. Moved everything in execution/python.scala into the newly created execution.python package.

Most of the diffs are just straight copy-paste.

Author: Reynold Xin <rxin@databricks.com>

Closes #11181 from rxin/SPARK-13296.
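For illustration, here is a minimal, hypothetical sketch (not part of this commit, and assuming a Spark 2.x SparkSession; the object name UdfMoveExample is invented) of what the move means for callers: only code that names the UserDefinedFunction type explicitly has to switch its import from org.apache.spark.sql to org.apache.spark.sql.expressions, while code that merely calls functions.udf compiles unchanged. The binary-level signature changes are what the MiMa filters below suppress.

// Hypothetical caller code illustrating the package move (not from the commit).
import org.apache.spark.sql.SparkSession
// Before SPARK-13296 this type lived at org.apache.spark.sql.UserDefinedFunction.
import org.apache.spark.sql.expressions.UserDefinedFunction
import org.apache.spark.sql.functions.udf

object UdfMoveExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("udf-move").master("local[*]").getOrCreate()
    import spark.implicits._

    // Only code that spells out the type, like this annotation, notices the move;
    // `val squared = udf(...)` without the annotation is unaffected at the source level.
    val squared: UserDefinedFunction = udf((x: Int) => x * x)

    Seq(1, 2, 3).toDF("x").select(squared($"x").as("x_squared")).show()
    spark.stop()
  }
}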
Diffstat (limited to 'project/MimaExcludes.scala')
-rw-r--r--  project/MimaExcludes.scala | 8 +++++++-
1 file changed, 7 insertions(+), 1 deletion(-)
diff --git a/project/MimaExcludes.scala b/project/MimaExcludes.scala
index 8611106db0..6abab7f126 100644
--- a/project/MimaExcludes.scala
+++ b/project/MimaExcludes.scala
@@ -235,7 +235,13 @@ object MimaExcludes {
ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.scheduler.cluster.YarnSchedulerBackend$YarnSchedulerEndpoint")
) ++ Seq(
// SPARK-7889
- ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.deploy.history.HistoryServer.org$apache$spark$deploy$history$HistoryServer$@tachSparkUI")
+ ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.deploy.history.HistoryServer.org$apache$spark$deploy$history$HistoryServer$@tachSparkUI"),
+ // SPARK-13296
+ ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.UDFRegistration.register"),
+ ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.UserDefinedPythonFunction$"),
+ ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.UserDefinedPythonFunction"),
+ ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.UserDefinedFunction"),
+ ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.sql.UserDefinedFunction$")
)
case v if v.startsWith("1.6") =>
Seq(
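For context, here is a minimal sketch of how filters like these are consumed, written against the generic sbt-mima-plugin settings rather than Spark's actual build (Spark wires MimaExcludes.excludes in through project/MimaBuild.scala); the artifact and class names below are hypothetical, and the setting names shown are the plugin's current ones, which may differ in older plugin versions.

// Hypothetical build.sbt fragment for a generic project using sbt-mima-plugin.
import com.typesafe.tools.mima.core._

// The previously released artifact that the current classes are compared against.
mimaPreviousArtifacts := Set("org.example" %% "example-lib" % "1.0.0")

// Incompatibilities matching these filters are suppressed, e.g. for classes
// that were moved between packages or were never meant to be public API.
mimaBinaryIssueFilters ++= Seq(
  ProblemFilters.exclude[MissingClassProblem]("org.example.sql.MovedHelper"),
  ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.example.sql.Registry.register")
)

With such settings in place, `sbt mimaReportBinaryIssues` reports only the incompatibilities that are not covered by a filter, which is why each intentional breaking change such as SPARK-13296 adds its own entries here.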