author     hyukjinkwon <gurwls223@gmail.com>    2016-05-23 17:20:29 -0700
committer  Shivaram Venkataraman <shivaram@cs.berkeley.edu>    2016-05-23 17:20:29 -0700
commit     a8e97d17b91684e68290d9f18a43622232aa94e7 (patch)
tree       6eda498a8c24f59e3863bd14a31ef85df4677cf1 /R/WINDOWS.md
parent     03c7b7c4b9374f0cb6a29aeaf495bd21c2563de4 (diff)
[MINOR][SPARKR][DOC] Add a description for running unit tests in Windows
## What changes were proposed in this pull request?

This PR adds a description for running unit tests in Windows.

## How was this patch tested?

On a bare machine (Windows 7, 32-bit), this was manually built and tested.

Author: hyukjinkwon <gurwls223@gmail.com>

Closes #13217 from HyukjinKwon/minor-r-doc.
Diffstat (limited to 'R/WINDOWS.md')
-rw-r--r--  R/WINDOWS.md  20
1 file changed, 20 insertions, 0 deletions
diff --git a/R/WINDOWS.md b/R/WINDOWS.md
index 3f889c0ca3..f948ed3974 100644
--- a/R/WINDOWS.md
+++ b/R/WINDOWS.md
@@ -11,3 +11,23 @@ include Rtools and R in `PATH`.
directory in Maven in `PATH`.
4. Set `MAVEN_OPTS` as described in [Building Spark](http://spark.apache.org/docs/latest/building-spark.html).
5. Open a command shell (`cmd`) in the Spark directory and run `mvn -DskipTests -Psparkr package`
+
+## Unit tests
+
+To run the SparkR unit tests on Windows, follow the steps below. They assume that you are in the Spark root directory and do not already have Apache Hadoop installed:
+
+1. Create a folder to hold the Hadoop-related files for Windows. For example, `cd ..` and `mkdir hadoop`.
+
+2. Download the relevant Hadoop bin package from [steveloughran/winutils](https://github.com/steveloughran/winutils). While these are not official ASF artifacts, they are built from the ASF release git hashes by a Hadoop PMC member on a dedicated Windows VM. For further reading, consult [Windows Problems on the Hadoop wiki](https://wiki.apache.org/hadoop/WindowsProblems).
+
+3. Install the files into `hadoop\bin`; make sure that `winutils.exe` and `hadoop.dll` are present.
+
+4. Set the environment variable `HADOOP_HOME` to the full path to the newly created `hadoop` directory.
+
+5. Run the SparkR unit tests by running the commands below. You need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first:
+
+    ```
+    R -e "install.packages('testthat', repos='http://cran.us.r-project.org')"
+    .\bin\spark-submit2.cmd --conf spark.hadoop.fs.default.name="file:///" R\pkg\tests\run-all.R
+    ```
+
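
For reference, the unit-test steps added by this patch can be condensed into a single `cmd` session. This is only a minimal sketch under stated assumptions, not part of the patch: the Hadoop 2.6.0 winutils download URLs and the `C:\path\to\hadoop` placeholder are illustrative and should be replaced with the winutils build matching your Spark/Hadoop profile and with the actual location of the folder you created.

```
REM Minimal sketch of the documented steps, run from the Spark root directory.
REM Assumes no existing Hadoop installation; URLs and paths below are examples only.

REM 1. Create a folder for the Windows Hadoop binaries.
mkdir ..\hadoop\bin

REM 2-3. Fetch winutils.exe and hadoop.dll from steveloughran/winutils into hadoop\bin
REM      (PowerShell 3+ shown; downloading them manually in a browser works as well).
powershell -Command "Invoke-WebRequest https://github.com/steveloughran/winutils/raw/master/hadoop-2.6.0/bin/winutils.exe -OutFile ..\hadoop\bin\winutils.exe"
powershell -Command "Invoke-WebRequest https://github.com/steveloughran/winutils/raw/master/hadoop-2.6.0/bin/hadoop.dll -OutFile ..\hadoop\bin\hadoop.dll"

REM 4. Point HADOOP_HOME at the new folder (replace with its real absolute path).
set HADOOP_HOME=C:\path\to\hadoop

REM 5. Install testthat and run the SparkR unit tests.
R -e "install.packages('testthat', repos='http://cran.us.r-project.org')"
.\bin\spark-submit2.cmd --conf spark.hadoop.fs.default.name="file:///" R\pkg\tests\run-all.R
```

If the tests cannot find `winutils.exe`, double-check that `HADOOP_HOME` points at the directory that contains `bin\winutils.exe`, not at the `bin` directory itself.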