author      Andrew Or <andrew@databricks.com>      2016-03-23 13:34:22 -0700
committer   Reynold Xin <rxin@databricks.com>      2016-03-23 13:34:22 -0700
commit      5dfc01976bb0d72489620b4f32cc12d620bb6260 (patch)
tree        18c0bef5f2c6b0099bd6e8b512b1718b75ecd015 /python/pyspark
parent      6bc4be64f86afcb38e4444c80c9400b7b6b745de (diff)
[SPARK-14014][SQL] Replace existing catalog with SessionCatalog
## What changes were proposed in this pull request?
`SessionCatalog`, introduced in #11750, is a catalog that keeps track of temporary functions and tables, and delegates metastore operations to `ExternalCatalog`. This functionality overlaps a lot with the existing `analysis.Catalog`.
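The division of responsibility described above — session-local temporary tables tracked in memory, persistent tables delegated to the metastore-backed catalog — can be sketched in plain Python. This is an illustrative model only, not Spark's Scala implementation; all class and method names here are hypothetical:

```python
class ExternalCatalog:
    """Illustrative stand-in for the persistent, metastore-backed catalog."""

    def __init__(self):
        # Map of database name -> set of table names; "default" always exists.
        self._tables = {"default": set()}

    def create_table(self, db, name):
        self._tables.setdefault(db, set()).add(name)

    def list_tables(self, db):
        return set(self._tables.get(db, set()))


class SessionCatalog:
    """Tracks session-scoped temporary tables itself and delegates
    persistent-table operations to the external catalog."""

    def __init__(self, external):
        self._external = external
        self._temp_tables = set()

    def register_temp_table(self, name):
        # Temporary tables live only in this session, never in the metastore.
        self._temp_tables.add(name)

    def list_tables(self, db):
        # Temporary tables are visible alongside the tables of any database.
        return self._external.list_tables(db) | self._temp_tables
```

In this model, a temporary table registered through the session shows up in listings for any database, which mirrors why the doctest in the diff below finds `"table1"` under the `default` database:

```python
catalog = SessionCatalog(ExternalCatalog())
catalog.register_temp_table("table1")
assert "table1" in catalog.list_tables("default")
```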
As of this commit, `SessionCatalog` and `ExternalCatalog` will no longer be dead code. There are still things that need to be done after this patch, namely:
- SPARK-14013: Properly implement temporary functions in `SessionCatalog`
- SPARK-13879: Decide which DDL/DML commands to support natively in Spark
- SPARK-?????: Implement the ones we do want to support through `SessionCatalog`.
- SPARK-?????: Merge SQL/HiveContext
## How was this patch tested?
This is largely a refactoring task, so no new tests are introduced. The most relevant existing tests are `SessionCatalogSuite` and `ExternalCatalogSuite`.
Author: Andrew Or <andrew@databricks.com>
Author: Yin Huai <yhuai@databricks.com>
Closes #11836 from andrewor14/use-session-catalog.
Diffstat (limited to 'python/pyspark')
-rw-r--r--   python/pyspark/sql/context.py   2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/python/pyspark/sql/context.py b/python/pyspark/sql/context.py
index 9c2f6a3c56..4008332c84 100644
--- a/python/pyspark/sql/context.py
+++ b/python/pyspark/sql/context.py
@@ -554,7 +554,7 @@ class SQLContext(object):
         >>> sqlContext.registerDataFrameAsTable(df, "table1")
         >>> "table1" in sqlContext.tableNames()
         True
-        >>> "table1" in sqlContext.tableNames("db")
+        >>> "table1" in sqlContext.tableNames("default")
         True
         """
         if dbName is None: