path: root/docs/_plugins
author    Wenchen Fan <wenchen@databricks.com>    2016-05-04 14:40:54 -0700
committer Andrew Or <andrew@databricks.com>    2016-05-04 14:40:54 -0700
commit    a432a2b86081a18cebf4085cead702436960f6c7 (patch)
tree      84f12ce725fe5e2bb8e338b3d1aec164e61d6dd8 /docs/_plugins
parent    eb019af9a9cadb127eab1b6d30312169ed90f808 (diff)
download  spark-a432a2b86081a18cebf4085cead702436960f6c7.tar.gz
          spark-a432a2b86081a18cebf4085cead702436960f6c7.tar.bz2
          spark-a432a2b86081a18cebf4085cead702436960f6c7.zip
[SPARK-15116] In REPL we should create SparkSession first and get SparkContext from it
## What changes were proposed in this pull request?

See https://github.com/apache/spark/pull/12873#discussion_r61993910. The problem is that if we create a `SparkContext` first and then call `SparkSession.builder.enableHiveSupport().getOrCreate()`, the existing `SparkContext` is reused and the Hive flag is never set.

## How was this patch tested?

Verified locally.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #12890 from cloud-fan/repl.
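The initialization-order issue described above can be sketched as follows. This is an illustrative snippet, not code from the patch; it assumes a Spark 2.x build on the classpath, and the object name is hypothetical:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

object ReplInitSketch {
  def main(args: Array[String]): Unit = {
    // Problematic order: create the SparkContext first...
    val conf = new SparkConf().setMaster("local[*]").setAppName("repl")
    val sc = new SparkContext(conf)

    // ...then getOrCreate() reuses that existing context, so
    // enableHiveSupport() cannot take effect: the context was already
    // constructed without Hive configuration.
    val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

    // The order this patch adopts for the REPL: build the SparkSession
    // first, then obtain the SparkContext from it.
    //   val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
    //   val sc    = spark.sparkContext
  }
}
```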
Diffstat (limited to 'docs/_plugins')
0 files changed, 0 insertions, 0 deletions