diff options
author | Xiao Li <gatorsmile@gmail.com> | 2017-04-24 17:21:42 +0800 |
---|---|---|
committer | Wenchen Fan <wenchen@databricks.com> | 2017-04-24 17:21:42 +0800 |
commit | 776a2c0e91dfea170ea1c489118e1d42c4121f35 (patch) | |
tree | d62e429620a4dd5505fc117e5e29eabd8437e104 /sql/hive | |
parent | e9f97154bc4af60376a550238315d7fc57099f9c (diff) | |
download | spark-776a2c0e91dfea170ea1c489118e1d42c4121f35.tar.gz spark-776a2c0e91dfea170ea1c489118e1d42c4121f35.tar.bz2 spark-776a2c0e91dfea170ea1c489118e1d42c4121f35.zip |
[SPARK-20439][SQL] Fix Catalog API listTables and getTable when failed to fetch table metadata
### What changes were proposed in this pull request?
`spark.catalog.listTables` and `spark.catalog.getTable` do not work if we are unable to retrieve table metadata for any reason (e.g., the table's serde class is not accessible, or the table type is not accepted by Spark SQL). After this PR, the APIs still return the corresponding `Table`, but without the description and tableType.
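A minimal sketch of the intended post-fix behavior (illustrative only; it assumes an active `SparkSession` named `spark` and a hypothetical table `my_index_table` whose metadata Spark cannot fully read):

```scala
import org.apache.spark.sql.catalog.Table

// After this fix, listTables no longer throws when a table's metadata
// cannot be fetched; such entries come back with description and
// tableType left null.
val tables: Array[Table] = spark.catalog.listTables("default").collect()
tables.foreach { t =>
  // tableType may be null for tables whose metadata could not be read
  println(s"${t.name} type=${t.tableType}")
}

// getTable likewise returns a Table instead of failing outright.
val t = spark.catalog.getTable("default", "my_index_table")
assert(t.name == "my_index_table")
```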
### How was this patch tested?
Added a test case
Author: Xiao Li <gatorsmile@gmail.com>
Closes #17730 from gatorsmile/listTables.
Diffstat (limited to 'sql/hive')
-rw-r--r-- | sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala | 8 |
1 file changed, 8 insertions, 0 deletions
diff --git a/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala b/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala
index 3906968aaf..16a99321ba 100644
--- a/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala
+++ b/sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveDDLSuite.scala
@@ -1197,6 +1197,14 @@ class HiveDDLSuite
         s"CREATE INDEX $indexName ON TABLE $tabName (a) AS 'COMPACT' WITH DEFERRED REBUILD")
       val indexTabName =
         spark.sessionState.catalog.listTables("default", s"*$indexName*").head.table
+
+      // Even if index tables exist, listTables and getTable APIs should still work
+      checkAnswer(
+        spark.catalog.listTables().toDF(),
+        Row(indexTabName, "default", null, null, false) ::
+          Row(tabName, "default", null, "MANAGED", false) :: Nil)
+      assert(spark.catalog.getTable("default", indexTabName).name === indexTabName)
+
       intercept[TableAlreadyExistsException] {
         sql(s"CREATE TABLE $indexTabName(b int)")
       }