path: root/project/MimaBuild.scala
author    Josh Rosen <joshrosen@databricks.com> 2016-03-16 23:02:25 -0700
committer Reynold Xin <rxin@databricks.com> 2016-03-16 23:02:25 -0700
commit    82066a166768399eada42f3d65150becf43320b3 (patch)
tree      3d4b9519cfbe4372af27ccfb0b10b5ab37bdf3ea /project/MimaBuild.scala
parent    5faba9faccb5ce43790c43284769e0f890340606 (diff)
[SPARK-13948] MiMa check should catch if the visibility changes to private

MiMa excludes are currently generated using both the current Spark version's classes and Spark 1.2.0's classes, but this doesn't make sense: we should only be ignoring classes which were `private` in the previous Spark version, not classes which became private in the current version. This patch updates `dev/mima` to only generate excludes with respect to the previous artifacts that MiMa checks against. It also updates `MimaBuild` so that `excludeClass` only applies directly to the class being excluded and not to its companion object (since a class and its companion object can have different accessibility).

Author: Josh Rosen <joshrosen@databricks.com>

Closes #11774 from JoshRosen/SPARK-13948.
Diffstat (limited to 'project/MimaBuild.scala')
-rw-r--r--  project/MimaBuild.scala | 7
1 file changed, 2 insertions(+), 5 deletions(-)
diff --git a/project/MimaBuild.scala b/project/MimaBuild.scala
index 4adf64a5a0..acf7b8961e 100644
--- a/project/MimaBuild.scala
+++ b/project/MimaBuild.scala
@@ -42,14 +42,11 @@ object MimaBuild {
       ProblemFilters.exclude[IncompatibleFieldTypeProblem](fullName)
     )
 
-  // Exclude a single class and its corresponding object
+  // Exclude a single class
   def excludeClass(className: String) = Seq(
     excludePackage(className),
     ProblemFilters.exclude[MissingClassProblem](className),
-    ProblemFilters.exclude[MissingTypesProblem](className),
-    excludePackage(className + "$"),
-    ProblemFilters.exclude[MissingClassProblem](className + "$"),
-    ProblemFilters.exclude[MissingTypesProblem](className + "$")
+    ProblemFilters.exclude[MissingTypesProblem](className)
   )
 
   // Exclude a Spark class, that is in the package org.apache.spark
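The removed lines above filtered `className + "$"` alongside `className`. A brief hedged sketch (not Spark's actual build code; `CompanionNames` and its method are hypothetical names for illustration) of why those were separate entries: a Scala companion object compiles to a distinct JVM class whose binary name is the class name with a trailing `"$"`, and as the commit message notes, that companion can have different accessibility from the class itself, so excluding them together is too coarse.

```scala
// Hypothetical helper illustrating Scala's companion-object name mangling.
// A companion object of class Foo compiles to a separate JVM class Foo$,
// which is why the old excludeClass also filtered className + "$".
object CompanionNames {
  def companionClassName(className: String): String = className + "$"

  def main(args: Array[String]): Unit = {
    val cls = "org.apache.spark.SomeClass"
    // The companion object of SomeClass lives in JVM class SomeClass$
    println(companionClassName(cls))
  }
}
```

After this patch, excluding a private class no longer silently excludes its possibly-public companion; callers who need both must exclude both names explicitly.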