From 8c49cebce572330fc84362662a9e3e8f7625bf5d Mon Sep 17 00:00:00 2001
From: Josh Rosen
Date: Wed, 27 Apr 2016 17:34:55 -0700
Subject: [SPARK-14966] SizeEstimator should ignore classes in the scala.reflect package

In local profiling, I noticed SizeEstimator spending tons of time estimating
the size of objects which contain TypeTag or ClassTag fields. The problem with
these tags is that they reference global Scala reflection objects, which, in
turn, reference many singletons, such as TestHive. This throws off the
accuracy of the size estimation and wastes tons of time traversing a huge
object graph. As a result, I think that SizeEstimator should ignore any
classes in the `scala.reflect` package.

Author: Josh Rosen

Closes #12741 from JoshRosen/ignore-scala-reflect-in-size-estimator.
---
 core/src/main/scala/org/apache/spark/util/SizeEstimator.scala | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/core/src/main/scala/org/apache/spark/util/SizeEstimator.scala b/core/src/main/scala/org/apache/spark/util/SizeEstimator.scala
index 6861a75612..386fdfd218 100644
--- a/core/src/main/scala/org/apache/spark/util/SizeEstimator.scala
+++ b/core/src/main/scala/org/apache/spark/util/SizeEstimator.scala
@@ -207,6 +207,9 @@ object SizeEstimator extends Logging {
     val cls = obj.getClass
     if (cls.isArray) {
       visitArray(obj, cls, state)
+    } else if (cls.getName.startsWith("scala.reflect")) {
+      // Many objects in the scala.reflect package reference global reflection objects which, in
+      // turn, reference many other large global objects. Do nothing in this case.
     } else if (obj.isInstanceOf[ClassLoader] || obj.isInstanceOf[Class[_]]) {
       // Hadoop JobConfs created in the interpreter have a ClassLoader, which greatly confuses
      // the size estimator since it references the whole REPL. Do nothing in this case. In
-- 
cgit v1.2.3
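
The following is not part of the patch: a minimal, hypothetical sketch of the kind of object whose ClassTag field previously dragged the global reflection graph into the estimate. The names TaggedHolder and SizeEstimatorDemo are invented for illustration; it assumes only the public SizeEstimator.estimate developer API.

    import scala.reflect.ClassTag

    import org.apache.spark.util.SizeEstimator

    // A holder that captures a ClassTag, as many generic collections do.
    class TaggedHolder[T](val values: Array[T])(implicit val tag: ClassTag[T])

    object SizeEstimatorDemo {
      def main(args: Array[String]): Unit = {
        val holder = new TaggedHolder(Array(1, 2, 3))
        // Before this patch, traversing holder.tag could wander into the global
        // scala.reflect object graph; with the patch, any class whose name starts
        // with "scala.reflect" is skipped, so the estimate stays close to the
        // size of the wrapped array.
        println(SizeEstimator.estimate(holder))
      }
    }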