author	c-sahuja <sahuja@cloudera.com>	2016-12-06 19:03:23 -0800
committer	Reynold Xin <rxin@databricks.com>	2016-12-06 19:03:23 -0800
commit	01c7c6b884244ac1a57e332c3aea669488ad9dc0 (patch)
tree	b05f0091ba0cce806959dd19f01d3dc6001fd0ed /build
parent	539bb3cf9573be5cd86e7e6502523ce89c0de170 (diff)
Update Spark documentation to provide information on how to create External Table
## What changes were proposed in this pull request?

Although `saveAsTable` does not currently expose a dedicated API for saving a DataFrame as an external table, the same result can be achieved through `DataFrameWriter` options: set the option key `"path"` to the desired location of the external table before calling `saveAsTable`. This pull request updates the Spark documentation to describe that approach.

## How was this patch tested?

The documentation was reviewed for formatting and content after the change was pushed to the branch.

![updated documentation](https://cloud.githubusercontent.com/assets/15376052/20953147/4cfcf308-bc57-11e6-807c-e21fb774a760.PNG)

Author: c-sahuja <sahuja@cloudera.com>

Closes #16185 from c-sahuja/createExternalTable.
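As a minimal sketch of the approach described above: the table name, path, and sample data below are illustrative placeholders, and `enableHiveSupport()` is assumed to be available so the table is registered in a metastore. Supplying the `"path"` option before `saveAsTable` causes Spark to create an external (unmanaged) table at that location rather than a managed one.

```scala
import org.apache.spark.sql.SparkSession

object ExternalTableExample {
  def main(args: Array[String]): Unit = {
    // Hypothetical app name and paths; adjust for your environment.
    val spark = SparkSession.builder()
      .appName("ExternalTableExample")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._

    // Small example DataFrame used only for illustration.
    val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")

    // Setting the "path" option before saveAsTable makes the resulting
    // table external: the data lives at the given location, and dropping
    // the table does not delete the underlying files.
    df.write
      .option("path", "/tmp/external/my_external_table")
      .saveAsTable("my_external_table")

    spark.stop()
  }
}
```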
Diffstat (limited to 'build')
0 files changed, 0 insertions, 0 deletions