author | gatorsmile <gatorsmile@gmail.com> | 2016-03-29 17:39:52 -0700
---|---|---
committer | Andrew Or <andrew@databricks.com> | 2016-03-29 17:39:52 -0700
commit | b66b97cd04067e1ec344fa2e28dd91e7ef937af5 (patch)
tree | 0d879872c83765c79e9a3de6483835dd5b7bf6c3 /sql/hive
parent | e1f6845391078726f60e760f0ea68ccf81f9eca9 (diff)
[SPARK-14124][SQL] Implement Database-related DDL Commands
#### What changes were proposed in this pull request?
This PR is to implement the following four Database-related DDL commands:
- `CREATE DATABASE|SCHEMA [IF NOT EXISTS] database_name`
- `DROP DATABASE [IF EXISTS] database_name [RESTRICT|CASCADE]`
- `DESCRIBE DATABASE [EXTENDED] db_name`
- `ALTER (DATABASE|SCHEMA) database_name SET DBPROPERTIES (property_name=property_value, ...)`
Another PR will be submitted to handle the unsupported commands. Among the Database-related DDL commands, `ALTER (DATABASE|SCHEMA) database_name SET OWNER [USER|ROLE] user_or_role` is not supported, so we throw an exception when it is issued.
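Taken together, a session could exercise the four commands like this (the database name `sales_db` and the property key/value are illustrative, not from the PR):

```sql
CREATE DATABASE IF NOT EXISTS sales_db;
ALTER DATABASE sales_db SET DBPROPERTIES ('owner.team' = 'analytics');
DESCRIBE DATABASE EXTENDED sales_db;
DROP DATABASE IF EXISTS sales_db CASCADE;
```

`CASCADE` drops the database even if it still contains tables; the default `RESTRICT` fails in that case.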
cc yhuai andrewor14 rxin — could you review the changes? Is this the right direction? Thanks!
#### How was this patch tested?
Added a few test cases in `command/DDLSuite.scala` to test DDL command execution in `SQLContext`. Since `HiveContext` shares the same implementation, the existing test cases under `hive` also verify the correctness of these commands.
Author: gatorsmile <gatorsmile@gmail.com>
Author: xiaoli <lixiao1983@gmail.com>
Author: Xiao Li <xiaoli@Xiaos-MacBook-Pro.local>
Closes #12009 from gatorsmile/dbDDL.
Diffstat (limited to 'sql/hive')
-rw-r--r-- | sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionCatalog.scala | 8
1 file changed, 8 insertions, 0 deletions
```diff
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionCatalog.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionCatalog.scala
index ff12245e8d..1cd783e63a 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionCatalog.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionCatalog.scala
@@ -17,6 +17,9 @@
 
 package org.apache.spark.sql.hive
 
+import org.apache.hadoop.fs.Path
+import org.apache.hadoop.hive.conf.HiveConf
+
 import org.apache.spark.sql.catalyst.TableIdentifier
 import org.apache.spark.sql.catalyst.analysis.FunctionRegistry
 import org.apache.spark.sql.catalyst.catalog.SessionCatalog
@@ -59,6 +62,11 @@ class HiveSessionCatalog(
   // | Methods and fields for interacting with HiveMetastoreCatalog |
   // ----------------------------------------------------------------
 
+  override def getDefaultDBPath(db: String): String = {
+    val defaultPath = context.hiveconf.getVar(HiveConf.ConfVars.METASTOREWAREHOUSE)
+    new Path(new Path(defaultPath), db + ".db").toString
+  }
+
   // Catalog for handling data source tables. TODO: This really doesn't belong here since it is
   // essentially a cache for metastore tables. However, it relies on a lot of session-specific
   // things so it would be a lot of work to split its functionality between HiveSessionCatalog
```
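The new `getDefaultDBPath` override resolves a database's default location by appending `<db>.db` to the metastore warehouse directory. A minimal sketch of that path logic, using `java.nio` in place of Hadoop's `Path` (the warehouse directory and database name are illustrative):

```scala
import java.nio.file.Paths

// Hypothetical stand-in for HiveSessionCatalog.getDefaultDBPath:
// join the metastore warehouse directory with "<db>.db".
def defaultDBPath(warehouseDir: String, db: String): String =
  Paths.get(warehouseDir, db + ".db").toString

println(defaultDBPath("/user/hive/warehouse", "sales"))
// → /user/hive/warehouse/sales.db
```

In the actual override, the warehouse directory comes from `HiveConf.ConfVars.METASTOREWAREHOUSE` (`hive.metastore.warehouse.dir`) rather than being passed in.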