author | Junyang Qian <junyangq@databricks.com> | 2016-08-10 11:18:23 -0700
committer | Shivaram Venkataraman <shivaram@cs.berkeley.edu> | 2016-08-10 11:18:23 -0700
commit | 214ba66a030bc3a718c567a742b0db44bf911d61 (patch)
tree | 08df168f2981e89321fbbb7418c1690c7cfc6958 /docs/running-on-yarn.md
parent | d4a9122430d6c3aeaaee32aa09d314016ff6ddc7 (diff)
[SPARK-16579][SPARKR] add install.spark function
## What changes were proposed in this pull request?
Add an `install.spark` function to the SparkR package. Users can run `install.spark()` to download and install Spark to a local directory from within R.
Updates since the first version:
- `install.spark()`
  - check for the tar file in the local cache folder, and download it only if it is not found
  - look-up priority for the mirror URL: user-provided -> preferred mirror site from the Apache website -> hard-coded backup option
  - use Spark 2.0.0
- `sparkR.session()`
  - can install Spark automatically when it is not found in `SPARK_HOME`
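The mirror look-up order described above could be sketched as follows. This is a minimal illustration only: the helper name `chooseMirror`, its arguments, and the backup URL are assumptions for this sketch, not SparkR's actual internals.

```r
# Illustrative sketch of the mirror-URL look-up priority:
# user-provided -> preferred Apache mirror -> hard-coded backup.
# `chooseMirror` is a hypothetical helper, not part of SparkR.
chooseMirror <- function(userUrl = NULL, preferredUrl = NULL,
                         backupUrl = "https://archive.apache.org/dist") {
  if (!is.null(userUrl)) {
    # Highest priority: a URL explicitly supplied by the user.
    return(userUrl)
  }
  if (!is.null(preferredUrl)) {
    # Next: the mirror suggested by the Apache website.
    return(preferredUrl)
  }
  # Fall back to the hard-coded backup option.
  backupUrl
}
```

In the actual function, the Spark tarball would then be downloaded from the chosen mirror only when it is not already present in the local cache folder.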
## How was this patch tested?
Manual tests, running the check-cran.sh script added in #14173.
Author: Junyang Qian <junyangq@databricks.com>
Closes #14258 from junyangq/SPARK-16579.
Diffstat (limited to 'docs/running-on-yarn.md')
0 files changed, 0 insertions, 0 deletions