author    Forest Fang <forest.fang@outlook.com>    2015-10-22 09:34:07 -0700
committer Shivaram Venkataraman <shivaram@cs.berkeley.edu>    2015-10-22 09:34:07 -0700
commit    94e2064fa1b04c05c805d9175c7c78bf583db5c6 (patch)
tree      04c19ace915e6c433173221b28d950a226a30258 /R/pkg/inst
parent    c03b6d11589102b91f08728519e8520025db91e1 (diff)
[SPARK-11244][SPARKR] sparkR.stop() should remove SQLContext
SparkR should remove `.sparkRSQLsc` and `.sparkRHivesc` when `sparkR.stop()` is called. Otherwise, even after the SparkContext is reinitialized, `sparkRSQL.init` returns the stale copy of the object and complains:

```r
sc <- sparkR.init("local")
sqlContext <- sparkRSQL.init(sc)
sparkR.stop()
sc <- sparkR.init("local")
sqlContext <- sparkRSQL.init(sc)
sqlContext
```

producing

```r
Error in callJMethod(x, "getClass") :
  Invalid jobj 1. If SparkR was restarted, Spark operations need to be re-executed.
```

I have added the check and removal only when the SparkContext itself is initialized, along with a corresponding test for this fix. Let me know if you want me to move the test to the SQL test suite instead.

p.s. I tried lint-r but ended up with a lot of errors on existing code.

Author: Forest Fang <forest.fang@outlook.com>

Closes #9205 from saurfang/sparkR.stop.
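For context, a minimal sketch of the shape of the fix inside `sparkR.stop()`: the cached SQL and Hive context handles are dropped only when the SparkContext handle itself exists and is being torn down. This assumes SparkR keeps its handles in a package-internal environment, called `.sparkREnv` here; the handle names `.sparkRSQLsc` and `.sparkRHivesc` come from the commit message, the rest is illustrative.

```r
# Sketch only: inside sparkR.stop(), assuming the package-internal
# environment .sparkREnv holds the JVM-backed handles.
env <- .sparkREnv
if (exists(".sparkRjsc", envir = env)) {
  # ... stop the JVM-side SparkContext and remove .sparkRjsc ...

  # Drop the cached SQLContext so a later sparkRSQL.init() creates
  # a fresh one instead of returning a stale jobj.
  if (exists(".sparkRSQLsc", envir = env)) {
    rm(".sparkRSQLsc", envir = env)
  }

  # Likewise for the cached HiveContext.
  if (exists(".sparkRHivesc", envir = env)) {
    rm(".sparkRHivesc", envir = env)
  }
}
```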
Diffstat (limited to 'R/pkg/inst')
-rw-r--r--  R/pkg/inst/tests/test_context.R | 10 ++++++++++
1 file changed, 10 insertions(+), 0 deletions(-)
diff --git a/R/pkg/inst/tests/test_context.R b/R/pkg/inst/tests/test_context.R
index 513bbc8e62..e99815ed15 100644
--- a/R/pkg/inst/tests/test_context.R
+++ b/R/pkg/inst/tests/test_context.R
@@ -26,6 +26,16 @@ test_that("repeatedly starting and stopping SparkR", {
}
})
+test_that("repeatedly starting and stopping SparkR SQL", {
+ for (i in 1:4) {
+ sc <- sparkR.init()
+ sqlContext <- sparkRSQL.init(sc)
+ df <- createDataFrame(sqlContext, data.frame(a = 1:20))
+ expect_equal(count(df), 20)
+ sparkR.stop()
+ }
+})
+
test_that("rdd GC across sparkR.stop", {
sparkR.stop()
sc <- sparkR.init() # sc should get id 0