path: root/R/pkg/vignettes/sparkr-vignettes.Rmd
author: Felix Cheung <felixcheung_m@hotmail.com> 2016-12-04 20:25:11 -0800
committer: Shivaram Venkataraman <shivaram@cs.berkeley.edu> 2016-12-04 20:25:11 -0800
commit: b019b3a8ac49336e657f5e093fa2fba77f8d12d2 (patch)
tree: 73887a634a8695b5699b1647e7ec7e0190702b7b /R/pkg/vignettes/sparkr-vignettes.Rmd
parent: d9eb4c7215f26dd05527c0b9980af35087ab9d64 (diff)
[SPARK-18643][SPARKR] SparkR hangs at session start when installed as a package without Spark
## What changes were proposed in this pull request?

If SparkR is running as a package and it has previously downloaded the Spark jar, it should be able to run as before without having to set SPARK_HOME. With this bug, the auto-installed Spark would only work in the first session, which appears to be a regression from the earlier behavior. The fix is to always try to install, or check for the cached Spark, when running in an interactive session. As discussed before, we should probably only install Spark iff running in an interactive session (R shell, RStudio, etc.).

## How was this patch tested?

Manually.

Author: Felix Cheung <felixcheung_m@hotmail.com>

Closes #16077 from felixcheung/rsessioninteractive.
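The behavior described above can be sketched roughly as follows. This is a hedged simplification, not the actual SparkR internals: the function name `resolveSparkHome` is hypothetical, and the real check lives inside `sparkR.session` and its installation helpers.

```r
# Hypothetical sketch of the fixed session-start logic.
# Assumptions: install.spark() reuses an existing cached download
# rather than re-downloading, as the SparkR docs describe.
resolveSparkHome <- function(sparkHome = Sys.getenv("SPARK_HOME")) {
  if (nzchar(sparkHome)) {
    # An explicit installation always wins.
    return(sparkHome)
  }
  if (interactive()) {
    # Always run the install/check in interactive sessions (R shell,
    # RStudio, ...). If a previous session already downloaded Spark,
    # the cached copy is found and reused -- this is the part that was
    # broken, where only the first session worked.
    return(install.spark())
  }
  stop("SPARK_HOME is not set and this is not an interactive session")
}
```

The key design point is gating the automatic download on `interactive()`, so non-interactive jobs (e.g. `Rscript` batch runs) fail fast instead of silently downloading Spark.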
Diffstat (limited to 'R/pkg/vignettes/sparkr-vignettes.Rmd')
-rw-r--r--  R/pkg/vignettes/sparkr-vignettes.Rmd  4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/R/pkg/vignettes/sparkr-vignettes.Rmd b/R/pkg/vignettes/sparkr-vignettes.Rmd
index 73a5e26a3b..a36f8fc0c1 100644
--- a/R/pkg/vignettes/sparkr-vignettes.Rmd
+++ b/R/pkg/vignettes/sparkr-vignettes.Rmd
@@ -94,13 +94,13 @@ sparkR.session.stop()
Different from many other R packages, to use SparkR, you need an additional installation of Apache Spark. The Spark installation will be used to run a backend process that will compile and execute SparkR programs.
-If you don't have Spark installed on the computer, you may download it from [Apache Spark Website](http://spark.apache.org/downloads.html). Alternatively, we provide an easy-to-use function `install.spark` to complete this process. You don't have to call it explicitly. We will check the installation when `sparkR.session` is called and `install.spark` function will be triggered automatically if no installation is found.
+After installing the SparkR package, you can call `sparkR.session` as explained in the previous section to start and it will check for the Spark installation. If you are working with SparkR from an interactive shell (eg. R, RStudio) then Spark is downloaded and cached automatically if it is not found. Alternatively, we provide an easy-to-use function `install.spark` for running this manually. If you don't have Spark installed on the computer, you may download it from [Apache Spark Website](http://spark.apache.org/downloads.html).
```{r, eval=FALSE}
install.spark()
```
-If you already have Spark installed, you don't have to install again and can pass the `sparkHome` argument to `sparkR.session` to let SparkR know where the Spark installation is.
+If you already have Spark installed, you don't have to install again and can pass the `sparkHome` argument to `sparkR.session` to let SparkR know where the existing Spark installation is.
```{r, eval=FALSE}
sparkR.session(sparkHome = "/HOME/spark")