author     Felix Cheung <felixcheung_m@hotmail.com>    2016-11-11 15:49:55 -0800
committer  Shivaram Venkataraman <shivaram@cs.berkeley.edu>    2016-11-11 15:49:55 -0800
commit     ba23f768f7419039df85530b84258ec31f0c22b4 (patch)
tree       20d35c6fcf8fad0231e8120dee6dbfbf0333c2b6 /R/README.md
parent     6e95325fc3726d260054bd6e7c0717b3c139917e (diff)
[SPARK-18264][SPARKR] build vignettes with package, update vignettes for CRAN release build and add info on release
## What changes were proposed in this pull request?

Changes to DESCRIPTION to build vignettes. Changes the vignette metadata to generate the recommended format (which is about <10% of the previous size). Unfortunately it does not look as nice (before - left, after - right):

![image](https://cloud.githubusercontent.com/assets/8969467/20040492/b75883e6-a40d-11e6-9534-25cdd5d59a8b.png)
![image](https://cloud.githubusercontent.com/assets/8969467/20040490/a40f4d42-a40d-11e6-8c91-af00ddcbdad9.png)

Also adds information on how to run the build/release to CRAN later.

## How was this patch tested?

Manually, and with unit tests.

shivaram We need this for branch-2.1

Author: Felix Cheung <felixcheung_m@hotmail.com>

Closes #15790 from felixcheung/rpkgvignettes.
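The note above about building the vignettes with the package and releasing to CRAN is easier to follow with a concrete sketch. The commands below are a minimal, illustrative sequence rather than the project's official release procedure; `R/pkg` is the SparkR package directory in the Spark source tree, and the version in the tarball name is hypothetical.

```bash
# Build the SparkR source package from the Spark root.
# With vignettes declared in DESCRIPTION, R CMD build also knits and
# bundles them into the tarball.
R CMD build R/pkg

# Run the CRAN-style checks (documentation, examples, vignette rebuild).
# The version in the tarball name is illustrative.
R CMD check --as-cran SparkR_2.1.0.tar.gz
```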
Diffstat (limited to 'R/README.md')
-rw-r--r--  R/README.md  |  8
1 file changed, 4 insertions, 4 deletions
diff --git a/R/README.md b/R/README.md
index 932d5272d0..47f9a86dfd 100644
--- a/R/README.md
+++ b/R/README.md
@@ -6,7 +6,7 @@ SparkR is an R package that provides a light-weight frontend to use Spark from R
Libraries of sparkR need to be created in `$SPARK_HOME/R/lib`. This can be done by running the script `$SPARK_HOME/R/install-dev.sh`.
By default the above script uses the system-wide installation of R. However, this can be changed to any user-installed location of R by setting the environment variable `R_HOME` to the full path of the base directory where R is installed, before running the install-dev.sh script.
-Example:
+Example:
```bash
# where /home/username/R is where R is installed and /home/username/R/bin contains the files R and RScript
export R_HOME=/home/username/R
@@ -46,7 +46,7 @@ Sys.setenv(SPARK_HOME="/Users/username/spark")
# This line loads SparkR from the installed directory
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
-sc <- sparkR.init(master="local")
+sparkR.session()
```
#### Making changes to SparkR
@@ -54,11 +54,11 @@ sc <- sparkR.init(master="local")
The [instructions](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark) for making contributions to Spark also apply to SparkR.
If you only make R file changes (i.e. no Scala changes) then you can just re-install the R package using `R/install-dev.sh` and test your changes.
Once you have made your changes, please include unit tests for them and run existing unit tests using the `R/run-tests.sh` script as described below.
-
+
#### Generating documentation
The SparkR documentation (Rd files and HTML files) is not a part of the source repository. To generate it you can run the script `R/create-docs.sh`. This script uses `devtools` and `knitr` to generate the docs, and these packages need to be installed on the machine before using the script. Also, you may need to install these [prerequisites](https://github.com/apache/spark/tree/master/docs#prerequisites). See also `R/DOCUMENTATION.md`.
-
+
### Examples, Unit tests
SparkR comes with several sample programs in the `examples/src/main/r` directory.
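The sample programs and the `R/run-tests.sh` script referenced in the diff are typically run from the Spark root. A rough sketch, assuming a built Spark checkout and using `dataframe.R` as a representative example name:

```bash
# Run a bundled SparkR example through spark-submit
# (dataframe.R is used here as a representative example name).
./bin/spark-submit examples/src/main/r/dataframe.R

# Run the SparkR unit tests; SparkR must already be installed into R/lib,
# e.g. via R/install-dev.sh.
./R/run-tests.sh
```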