authorHerman van Hovell <hvanhovell@databricks.com>2016-10-01 00:50:16 -0700
committerReynold Xin <rxin@databricks.com>2016-10-01 00:50:16 -0700
commitaf6ece33d39cf305bd4a211d08a2f8e910c69bc1 (patch)
tree56ca7515a393966478b738836e00b4f3c2d4df0d /project
parent4bcd9b728b8df74756d16b27725c2db7c523d4b2 (diff)
[SPARK-17717][SQL] Add Exist/find methods to Catalog [FOLLOW-UP]
## What changes were proposed in this pull request?

We added find and exists methods for Databases, Tables and Functions to the user-facing Catalog in PR https://github.com/apache/spark/pull/15301. However, it was brought up that the semantics of the `find` methods are more in line with a `get` method (get an object or else fail). So we rename these in this PR.

## How was this patch tested?

Existing tests.

Author: Herman van Hovell <hvanhovell@databricks.com>

Closes #15308 from hvanhovell/SPARK-17717-2.
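For illustration only (not part of this patch), here is a minimal sketch of the renamed user-facing Catalog API. It assumes a `SparkSession` named `spark` and a hypothetical table `people` in the `default` database: the `get*` methods (formerly `find*`) return the metadata object or else fail, while the `*Exists` methods return a Boolean.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("catalog-example").getOrCreate()

// Existence checks return Boolean and do not fail for missing objects.
val hasDb: Boolean = spark.catalog.databaseExists("default")
val hasTable: Boolean = spark.catalog.tableExists("people") // "people" is an assumed example table

// `get*` methods (formerly `find*`) return the object or else fail.
if (hasDb) {
  val db = spark.catalog.getDatabase("default")
  println(s"Database: ${db.name}, location: ${db.locationUri}")
}
if (hasTable) {
  val tbl = spark.catalog.getTable("people")
  println(s"Table: ${tbl.name}, type: ${tbl.tableType}")
}
```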
Diffstat (limited to 'project')
-rw-r--r--  project/MimaExcludes.scala | 10
1 file changed, 4 insertions(+), 6 deletions(-)
diff --git a/project/MimaExcludes.scala b/project/MimaExcludes.scala
index 2ffe0ac9bc..7362041428 100644
--- a/project/MimaExcludes.scala
+++ b/project/MimaExcludes.scala
@@ -48,14 +48,12 @@ object MimaExcludes {
// [SPARK-16240] ML persistence backward compatibility for LDA
ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.ml.clustering.LDA$"),
// [SPARK-17717] Add Find and Exists method to Catalog.
- ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.findDatabase"),
- ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.findTable"),
- ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.findFunction"),
- ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.findColumn"),
+ ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.getDatabase"),
+ ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.getTable"),
+ ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.getFunction"),
ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.databaseExists"),
ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.tableExists"),
- ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.functionExists"),
- ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.columnExists")
+ ProblemFilters.exclude[ReversedMissingMethodProblem]("org.apache.spark.sql.catalog.Catalog.functionExists")
)
}