author | Wenchen Fan <wenchen@databricks.com> | 2016-07-25 22:02:00 +0800
---|---|---
committer | Cheng Lian <lian@databricks.com> | 2016-07-25 22:02:00 +0800
commit | d27d362ebae0c4a5cc6c99f13ef20049214dd4f9 (patch) |
tree | 0f214921fc8ac1aff74f0c8e4f171adb724c43fc /sql/hive/src/main |
parent | 7ffd99ec5f267730734431097cbb700ad074bebe (diff) |
[SPARK-16660][SQL] CreateViewCommand should not take CatalogTable
## What changes were proposed in this pull request?
`CreateViewCommand` only needs some of the information in a `CatalogTable`, not all of it. We currently rely on a few tricks (e.g. checking that the table type is `VIEW`, making `CatalogColumn.dataType` nullable) to let it take a `CatalogTable`.
This PR cleans that up and passes only the necessary information to `CreateViewCommand`.
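The idea can be sketched in a few lines of Scala. This is a simplified, hypothetical illustration (the names `TableIdentifier` and the field list echo the Spark sources, but `CreateViewCommandSketch` and its exact parameters are stand-ins, not the real class): instead of wrapping an entire `CatalogTable`, the command declares only the fields it actually uses.

```scala
// A stand-in for the identifier type used in Spark's catalog.
case class TableIdentifier(table: String, database: Option[String] = None)

// Hypothetical sketch of the refactored command: rather than taking a full
// CatalogTable (and validating fields it never uses, such as tableType),
// each necessary piece is passed explicitly.
case class CreateViewCommandSketch(
    name: TableIdentifier,
    userSpecifiedColumns: Seq[(String, Option[String])],
    comment: Option[String],
    properties: Map[String, String],
    originalText: Option[String],
    allowExisting: Boolean,
    replace: Boolean)

// Callers now construct the command directly from what they have,
// with no need to fabricate a CatalogTable of type VIEW first.
val cmd = CreateViewCommandSketch(
  name = TableIdentifier("v1"),
  userSpecifiedColumns = Seq("c1" -> None),
  comment = None,
  properties = Map.empty,
  originalText = Some("SELECT 1 AS c1"),
  allowExisting = false,
  replace = false)

println(cmd.name.table)
```

The design choice here is the usual one for command objects: a narrow, explicit parameter list documents exactly what the command depends on, whereas a catch-all `CatalogTable` forces both the caller and the command to reason about fields that are irrelevant to view creation.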
## How was this patch tested?
Existing tests.
Author: Wenchen Fan <wenchen@databricks.com>
Closes #14297 from cloud-fan/minor2.
Diffstat (limited to 'sql/hive/src/main')
-rw-r--r-- | sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala | 2
1 file changed, 0 insertions, 2 deletions
```diff
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
index d308a31061..db970785a7 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
@@ -171,8 +171,6 @@ private[hive] class HiveMetastoreCatalog(sparkSession: SparkSession) extends Log
     } else if (table.tableType == CatalogTableType.VIEW) {
       val viewText = table.viewText.getOrElse(sys.error("Invalid view without text."))
       alias match {
-        // because hive use things like `_c0` to build the expanded text
-        // currently we cannot support view from "create view v1(c1) as ..."
         case None =>
           SubqueryAlias(table.identifier.table,
             sparkSession.sessionState.sqlParser.parsePlan(viewText))
```