author     Alex Bozarth <ajbozart@us.ibm.com>  2016-05-09 11:51:37 -0700
committer  Andrew Or <andrew@databricks.com>   2016-05-09 11:51:37 -0700
commit     c3e23bc0c3e87546d0575c3c4c45a2b0e2dfec6a (patch)
tree       4d6d98f7a7db550c09d1ce4dfa775222080faa94 /project/MimaExcludes.scala
parent     7bf9b12019bb20470b726a7233d60ce38a9c52cc (diff)
[SPARK-10653][CORE] Remove unnecessary things from SparkEnv
## What changes were proposed in this pull request?

Removed blockTransferService and sparkFilesDir from SparkEnv since they're rarely used and don't need to be stored in the env. Edited their few usages to accommodate the change.

## How was this patch tested?

Ran dev/run-tests locally.

Author: Alex Bozarth <ajbozart@us.ibm.com>

Closes #12970 from ajbozarth/spark10653.
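The only change in this file whitelists the two removed SparkEnv members so MiMa's binary-compatibility check stops flagging them. For reference, below is a minimal standalone sketch of how such exclusion rules are declared with the sbt-mima-plugin's ProblemFilters API; the object name ExampleExcludes is hypothetical, while the imports and the two filters mirror what the diff adds to Spark's real MimaExcludes.

```scala
// Hypothetical standalone sketch of MiMa exclusion rules, mirroring the two
// filters this patch adds. Assumes the sbt-mima-plugin classes are on the
// build classpath (as they are for Spark's project/ build definition).
import com.typesafe.tools.mima.core._
import com.typesafe.tools.mima.core.ProblemFilters._

object ExampleExcludes {
  // Each filter tells MiMa to ignore one specific reported incompatibility,
  // here the deliberate removal of two SparkEnv accessors.
  val excludes = Seq(
    // [SPARK-10653] [Core] Remove unnecessary things from SparkEnv
    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkEnv.sparkFilesDir"),
    ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkEnv.blockTransferService")
  )
}
```

In Spark's actual MimaExcludes.scala, groups of filters like this are concatenated per change (the `) ++ Seq(` pattern visible in the diff below) so each PR's exclusions stay together with a comment naming the JIRA ticket.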
Diffstat (limited to 'project/MimaExcludes.scala')
-rw-r--r--  project/MimaExcludes.scala  4
1 file changed, 4 insertions(+), 0 deletions(-)
diff --git a/project/MimaExcludes.scala b/project/MimaExcludes.scala
index 33e0db606c..a5d57e1b01 100644
--- a/project/MimaExcludes.scala
+++ b/project/MimaExcludes.scala
@@ -694,6 +694,10 @@ object MimaExcludes {
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ml.classification.LogisticRegressionModel.weights"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ml.regression.LinearRegressionModel.weights")
) ++ Seq(
+ // [SPARK-10653] [Core] Remove unnecessary things from SparkEnv
+ ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkEnv.sparkFilesDir"),
+ ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.SparkEnv.blockTransferService")
+ ) ++ Seq(
// SPARK-14654: New accumulator API
ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.ExceptionFailure$"),
ProblemFilters.exclude[DirectMissingMethodProblem]("org.apache.spark.ExceptionFailure.apply"),