author    wangzhenhua <wangzhenhua@huawei.com>  2017-04-14 19:16:47 +0800
committer Wenchen Fan <wenchen@databricks.com>  2017-04-14 19:16:47 +0800
commit    fb036c4413c2cd4d90880d080f418ec468d6c0fc (patch)
tree      8d155d76971538e5ffd6c1f5262653ae813646ce /sql/hive/src/main/scala/org/apache
parent    7536e2849df6d63587fbf16b4ecb5db06fed7125 (diff)
[SPARK-20318][SQL] Use Catalyst type for min/max in ColumnStat for ease of estimation
## What changes were proposed in this pull request?

Currently, when estimating predicates like `col > literal` or `col = literal`, we update min or max in column stats based on the literal value. However, the literal value is of Catalyst type (internal type), while min/max are of external type. Then for the next predicate, we again need to do a type conversion to compare and update column stats. This is awkward and causes many unnecessary conversions in estimation.

To solve this, we use Catalyst type for min/max in `ColumnStat`. Note that the persistent format in the metastore is still of external type, so there is no inconsistency for statistics in the metastore.

This PR also fixes a bug for boolean type in `IN` condition.

## How was this patch tested?

The changes for `ColumnStat` are covered by existing tests. For the bug fix, a new test for boolean type in `IN` condition is added.

Author: wangzhenhua <wangzhenhua@huawei.com>

Closes #17630 from wzhfy/refactorColumnStat.
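The distinction above can be sketched in a few lines of Scala. The types below (`DataType`, `ColumnStat`, `toExternal`) are simplified stand-ins for Spark's real classes, not the actual Catalyst API: the point is that once min/max hold internal (Catalyst) values, serializing them to metastore properties needs the column's data type, which is why the patched call below passes `colNameTypeMap(colName)` into `toMap`.

```scala
// Hedged sketch: illustrative stand-ins, not Spark's real ColumnStat.
sealed trait DataType
case object IntType extends DataType
case object DateType extends DataType // Catalyst stores dates as Int days-since-epoch

// min/max hold internal (Catalyst) values, e.g. an Int day count for dates.
case class ColumnStat(min: Option[Any], max: Option[Any]) {
  // Converting to metastore properties needs the external (string) form,
  // so the column's DataType must be supplied by the caller.
  def toMap(colName: String, dataType: DataType): Map[String, String] = {
    def toExternal(v: Any): String = dataType match {
      case IntType  => v.toString
      case DateType =>
        java.time.LocalDate.ofEpochDay(v.asInstanceOf[Int].toLong).toString
    }
    Seq(
      min.map(v => s"$colName.min" -> toExternal(v)),
      max.map(v => s"$colName.max" -> toExternal(v))
    ).flatten.toMap
  }
}

// Usage mirroring the diff: a schema-derived name-to-type map drives serialization.
val colNameTypeMap: Map[String, DataType] = Map("d" -> DateType, "n" -> IntType)
val stat = ColumnStat(min = Some(0), max = Some(18262)) // epoch days
println(stat.toMap("d", colNameTypeMap("d"))) // min renders as 1970-01-01
```

Estimation code can now compare min/max against Catalyst literals directly, and the external conversion happens only once, at persistence time.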
Diffstat (limited to 'sql/hive/src/main/scala/org/apache')
-rw-r--r--  sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala  4
1 file changed, 3 insertions(+), 1 deletion(-)
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
index 806f2be5fa..8b0fdf49ce 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala
@@ -526,8 +526,10 @@ private[spark] class HiveExternalCatalog(conf: SparkConf, hadoopConf: Configurat
     if (stats.rowCount.isDefined) {
       statsProperties += STATISTICS_NUM_ROWS -> stats.rowCount.get.toString()
     }
+    val colNameTypeMap: Map[String, DataType] =
+      tableDefinition.schema.fields.map(f => (f.name, f.dataType)).toMap
     stats.colStats.foreach { case (colName, colStat) =>
-      colStat.toMap.foreach { case (k, v) =>
+      colStat.toMap(colName, colNameTypeMap(colName)).foreach { case (k, v) =>
         statsProperties += (columnStatKeyPropName(colName, k) -> v)
       }
     }