| author | Hossein <hossein@databricks.com> | 2017-03-27 08:53:45 -0700 |
|---|---|---|
| committer | Xiangrui Meng <meng@databricks.com> | 2017-03-27 08:53:45 -0700 |
| commit | 0588dc7c0a9f3180dddae0dc202a6d41eb43464f (patch) | |
| tree | 115bf5fcf8843d63899c13141553906b14b27096 /core | |
| parent | 890493458de396cfcffdd71233cfdd39e834944b (diff) | |
# [SPARK-20088] Do not create new SparkContext in SparkR createSparkContext
## What changes were proposed in this pull request?
Instead of constructing a brand-new `SparkContext` inside the `JavaSparkContext`, we obtain one via `SparkContext.getOrCreate`, which reuses the already-active `SparkContext` if one exists and only creates a new one otherwise.
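The get-or-create pattern behind `SparkContext.getOrCreate` can be sketched in plain Scala without a Spark dependency. `Context` and `ContextHolder` below are hypothetical stand-ins for illustration, not Spark APIs: the holder hands back the active context if one exists and only constructs a new one otherwise.

```scala
// Hypothetical stand-in for SparkContext (not a Spark API).
final class Context(val appName: String)

// Hypothetical singleton illustrating the getOrCreate pattern:
// reuse the active context if present, otherwise create and
// remember one built from the supplied configuration.
object ContextHolder {
  private var active: Option[Context] = None

  def getOrCreate(appName: String): Context = synchronized {
    active match {
      case Some(ctx) => ctx // an active context already exists: reuse it
      case None =>
        val ctx = new Context(appName) // none active: create and cache it
        active = Some(ctx)
        ctx
    }
  }
}
```

A second call with a different configuration returns the first context unchanged, which is exactly why this patch avoids the "only one SparkContext per JVM" error that `new` would trigger when SparkR attaches to an existing session.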
## How was this patch tested?
Existing tests
Author: Hossein <hossein@databricks.com>
Closes #17423 from falaki/SPARK-20088.
Diffstat (limited to 'core')

| -rw-r--r-- | core/src/main/scala/org/apache/spark/api/r/RRDD.scala | 2 |
|---|---|---|

1 file changed, 1 insertion(+), 1 deletion(-)
```diff
diff --git a/core/src/main/scala/org/apache/spark/api/r/RRDD.scala b/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
index 72ae0340aa..295355c7bf 100644
--- a/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
+++ b/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
@@ -136,7 +136,7 @@ private[r] object RRDD {
         .mkString(File.separator))
     }
-    val jsc = new JavaSparkContext(sparkConf)
+    val jsc = new JavaSparkContext(SparkContext.getOrCreate(sparkConf))
     jars.foreach { jar =>
       jsc.addJar(jar)
     }
```