author    | Andrew Or <andrew@databricks.com> | 2015-05-05 09:37:49 -0700
committer | Andrew Or <andrew@databricks.com> | 2015-05-05 09:37:49 -0700
commit    | 57e9f29e17d97ed9d0f110fb2ce5a075b854a841 (patch)
tree      | 277e057cac90343112f85a10e5667bae32490d5f /core
parent    | 1fdabf8dcdb31391fec3952d312eb0ac59ece43b (diff)
[SPARK-7318] [STREAMING] DStream cleans objects that are not closures
DStream was passing objects that are not closures (the RDD itself, rather than the user function) to the closure cleaner. I added a check in `ClosureCleaner#clean` to detect this early: it now logs a warning and returns instead of attempting to clean a non-closure. tdas
Author: Andrew Or <andrew@databricks.com>
Closes #5860 from andrewor14/streaming-closure-cleaner and squashes the following commits:
8e971d7 [Andrew Or] Do not throw exception if object to clean is not closure
5ee4e25 [Andrew Or] Fix tests
eed3390 [Andrew Or] Merge branch 'master' of github.com:apache/spark into streaming-closure-cleaner
67eeff4 [Andrew Or] Add tests
a4fa768 [Andrew Or] Clean the closure, not the RDD
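For context, here is a minimal, self-contained sketch of the caller-side bug referenced by commit a4fa768 ("Clean the closure, not the RDD"). Every name in it is an illustrative stand-in, not actual Spark or DStream source; in particular, `clean` here is an identity stand-in for the private `SparkContext#clean`:

object CleanCallerSketch {
  // Illustrative stand-in for SparkContext#clean (private[spark] in Spark).
  // The real method runs ClosureCleaner over the function; identity suffices
  // here because only the call pattern matters for this sketch.
  def clean[F <: AnyRef](f: F): F = f

  def main(args: Array[String]): Unit = {
    val data = Seq(1, 2, 3)                     // stand-in for an RDD
    val f: Seq[Int] => Seq[Int] = _.map(_ + 1)  // the user-supplied closure

    // Bug pattern: clean(data) handed the dataset itself to the cleaner.
    // Fix: clean the user function, then apply it to the data.
    val cleanedF = clean(f)
    println(cleanedF(data))                     // List(2, 3, 4)
  }
}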
Diffstat (limited to 'core')
-rw-r--r-- | core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala | 5
1 file changed, 5 insertions, 0 deletions
diff --git a/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala b/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala
index 19fe6cb9de..6fe32e469c 100644
--- a/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala
+++ b/core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala
@@ -179,6 +179,11 @@ private[spark] object ClosureCleaner extends Logging {
       cleanTransitively: Boolean,
       accessedFields: Map[Class[_], Set[String]]): Unit = {
 
+    if (!isClosure(func.getClass)) {
+      logWarning("Expected a closure; got " + func.getClass.getName)
+      return
+    }
+
     // TODO: clean all inner closures first. This requires us to find the inner objects.
     // TODO: cache outerClasses / innerClasses / accessedFields
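The guard depends on `ClosureCleaner#isClosure`, which in Spark of this era tests whether the class name carries the `$anonfun$` marker the Scala 2.10/2.11 compiler gives anonymous-function classes. Below is a self-contained sketch of the warn-and-return behavior; the extra `$Lambda$` clause is our assumption so the demo also holds on Scala 2.12+, where lambdas are synthesized via LambdaMetaFactory and get differently shaped class names:

object ClosureGuardSketch {
  // Modeled on Spark 1.x ClosureCleaner.isClosure, which checks for the
  // "$anonfun$" name marker; the "$Lambda$" clause is our Scala 2.12+ addition.
  private def isClosure(cls: Class[_]): Boolean =
    cls.getName.contains("$anonfun$") || cls.getName.contains("$Lambda$")

  // Sketch of the patched clean(): warn and return instead of throwing.
  def clean(func: AnyRef): Unit = {
    if (!isClosure(func.getClass)) {
      println("WARN Expected a closure; got " + func.getClass.getName)
      return
    }
    // ... the real cleaner walks outer/inner classes and nulls unused refs ...
  }

  def main(args: Array[String]): Unit = {
    val f: String => Int = _.length
    clean(f)                               // genuine closure: no warning
    clean(new java.util.ArrayList[Int]())  // not a closure: warns and returns
  }
}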