author     Felix Cheung <felixcheung_m@hotmail.com>          2016-12-08 11:29:31 -0800
committer  Shivaram Venkataraman <shivaram@cs.berkeley.edu>  2016-12-08 11:29:31 -0800
commit     c3d3a9d0e85b834abef87069e4edd27db87fc607 (patch)
tree       bcb6a0d4e506d6ddce58cf927e2e6b8e85970045 /dev
parent     3c68944b229aaaeeaee3efcbae3e3be9a2914855 (diff)
[SPARK-18590][SPARKR] build R source package when making distribution
## What changes were proposed in this pull request?

This PR has two key changes. One, we build a source package (a.k.a. bundle package) for SparkR that could be released on CRAN. Two, the official Spark binary distributions should include SparkR installed from this source package instead (which has the help/vignettes .rds files needed for those features to work when the SparkR package is loaded in R, whereas the earlier approach with devtools did not). But, because of various differences in how R performs different tasks, this PR is a fair bit more complicated. More details below. This PR also includes a few minor fixes.

### More details

These are the additional steps in make-distribution; please see [here](https://github.com/apache/spark/blob/master/R/CRAN_RELEASE.md) for what goes into a CRAN release, which is now run during make-distribution.sh.

1. The package needs to be installed first because the first code block in the vignettes is `library(SparkR)` without a lib path.
2. `R CMD build` builds the vignettes (this process runs Spark/SparkR code and captures the output into the pdf documentation).
3. `R CMD check` on the source package installs the package and builds the vignettes again (this time from the source package). This is a key step required to release an R package on CRAN (tests are skipped here, but they will need to pass for the CRAN release process to succeed; ideally, during release sign-off we should install from the R source package and run the tests).
4. `R CMD INSTALL` on the source package (this is the only way to generate the doc/vignettes .rds files correctly; step 1 does not). The output of this step is what we package into the Spark dist and sparkr.zip.

A rough sketch of this command sequence is shown below.

Alternatively, `R CMD build` should already be installing the package in a temp directory, though it might just be finding this location and setting it as the lib.loc parameter; another approach would be to call `R CMD INSTALL --build pkg` instead. In any case, despite installing the package multiple times, this is relatively fast. Building the vignettes takes a while, though.

## How was this patch tested?

Manually, and by CI.

Author: Felix Cheung <felixcheung_m@hotmail.com>

Closes #16014 from felixcheung/rdist.
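For illustration, the four steps above correspond roughly to the following R tooling invocations (a minimal sketch, not the literal contents of `R/check-cran.sh`; it assumes the package source lives under `R/pkg` and that only one generated tarball matches the wildcard):

```sh
cd "$SPARK_HOME/R"

# 1. Install the package first, since the vignettes begin with library(SparkR) without a lib path
R CMD INSTALL pkg/

# 2. Build the source (bundle) package; this runs Spark/SparkR code to render the vignettes
R CMD build pkg/

# 3. Check the source package, skipping tests (they must still pass for an actual CRAN release)
R CMD check --no-tests SparkR_*.tar.gz

# 4. Install from the built source package; this is what generates the help/vignettes .rds files
R CMD INSTALL SparkR_*.tar.gz
```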
Diffstat (limited to 'dev')
-rwxr-xr-x  dev/create-release/release-build.sh  27
-rwxr-xr-x  dev/make-distribution.sh             25
2 files changed, 44 insertions(+), 8 deletions(-)
diff --git a/dev/create-release/release-build.sh b/dev/create-release/release-build.sh
index aa42750f26..8863ee6cd7 100755
--- a/dev/create-release/release-build.sh
+++ b/dev/create-release/release-build.sh
@@ -150,7 +150,7 @@ if [[ "$1" == "package" ]]; then
NAME=$1
FLAGS=$2
ZINC_PORT=$3
- BUILD_PIP_PACKAGE=$4
+ BUILD_PACKAGE=$4
cp -r spark spark-$SPARK_VERSION-bin-$NAME
cd spark-$SPARK_VERSION-bin-$NAME
@@ -172,11 +172,30 @@ if [[ "$1" == "package" ]]; then
MVN_HOME=`$MVN -version 2>&1 | grep 'Maven home' | awk '{print $NF}'`
- if [ -z "$BUILD_PIP_PACKAGE" ]; then
- echo "Creating distribution without PIP package"
+ if [ -z "$BUILD_PACKAGE" ]; then
+ echo "Creating distribution without PIP/R package"
./dev/make-distribution.sh --name $NAME --mvn $MVN_HOME/bin/mvn --tgz $FLAGS \
-DzincPort=$ZINC_PORT 2>&1 > ../binary-release-$NAME.log
cd ..
+ elif [[ "$BUILD_PACKAGE" == "withr" ]]; then
+ echo "Creating distribution with R package"
+ ./dev/make-distribution.sh --name $NAME --mvn $MVN_HOME/bin/mvn --tgz --r $FLAGS \
+ -DzincPort=$ZINC_PORT 2>&1 > ../binary-release-$NAME.log
+ cd ..
+
+ echo "Copying and signing R source package"
+ R_DIST_NAME=SparkR_$SPARK_VERSION.tar.gz
+ cp spark-$SPARK_VERSION-bin-$NAME/R/$R_DIST_NAME .
+
+ echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --armour \
+ --output $R_DIST_NAME.asc \
+ --detach-sig $R_DIST_NAME
+ echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --print-md \
+ MD5 $R_DIST_NAME > \
+ $R_DIST_NAME.md5
+ echo $GPG_PASSPHRASE | $GPG --passphrase-fd 0 --print-md \
+ SHA512 $R_DIST_NAME > \
+ $R_DIST_NAME.sha
else
echo "Creating distribution with PIP package"
./dev/make-distribution.sh --name $NAME --mvn $MVN_HOME/bin/mvn --tgz --pip $FLAGS \
@@ -222,7 +241,7 @@ if [[ "$1" == "package" ]]; then
make_binary_release "hadoop2.6" "-Phadoop-2.6 $FLAGS" "3035" &
make_binary_release "hadoop2.7" "-Phadoop-2.7 $FLAGS" "3036" "withpip" &
make_binary_release "hadoop2.4-without-hive" "-Psparkr -Phadoop-2.4 -Pyarn -Pmesos" "3037" &
- make_binary_release "without-hadoop" "-Psparkr -Phadoop-provided -Pyarn -Pmesos" "3038" &
+ make_binary_release "without-hadoop" "-Psparkr -Phadoop-provided -Pyarn -Pmesos" "3038" "withr" &
wait
rm -rf spark-$SPARK_VERSION-bin-*/
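For reference, a downstream consumer could check the artifacts produced above with standard GnuPG tooling along these lines (illustrative only; the actual file name depends on `$SPARK_VERSION`, and the Spark release signing key must already be imported):

```sh
# Verify the detached ASCII-armoured signature against the R source package
gpg --verify SparkR_2.1.0.tar.gz.asc SparkR_2.1.0.tar.gz
```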
diff --git a/dev/make-distribution.sh b/dev/make-distribution.sh
index 49b46fbc3f..fe281bbaa2 100755
--- a/dev/make-distribution.sh
+++ b/dev/make-distribution.sh
@@ -34,6 +34,7 @@ DISTDIR="$SPARK_HOME/dist"
MAKE_TGZ=false
MAKE_PIP=false
+MAKE_R=false
NAME=none
MVN="$SPARK_HOME/build/mvn"
@@ -41,7 +42,7 @@ function exit_with_usage {
echo "make-distribution.sh - tool for making binary distributions of Spark"
echo ""
echo "usage:"
- cl_options="[--name] [--tgz] [--pip] [--mvn <mvn-command>]"
+ cl_options="[--name] [--tgz] [--pip] [--r] [--mvn <mvn-command>]"
echo "make-distribution.sh $cl_options <maven build options>"
echo "See Spark's \"Building Spark\" doc for correct Maven options."
echo ""
@@ -71,6 +72,9 @@ while (( "$#" )); do
--pip)
MAKE_PIP=true
;;
+ --r)
+ MAKE_R=true
+ ;;
--mvn)
MVN="$2"
shift
@@ -208,11 +212,24 @@ cp -r "$SPARK_HOME/data" "$DISTDIR"
# Make pip package
if [ "$MAKE_PIP" == "true" ]; then
echo "Building python distribution package"
- cd $SPARK_HOME/python
+ pushd "$SPARK_HOME/python" > /dev/null
python setup.py sdist
- cd ..
+ popd > /dev/null
+else
+ echo "Skipping building python distribution package"
+fi
+
+# Make R package - this is used for both CRAN release and packing R layout into distribution
+if [ "$MAKE_R" == "true" ]; then
+ echo "Building R source package"
+ pushd "$SPARK_HOME/R" > /dev/null
+ # Build source package and run full checks
+ # Install source package to get it to generate vignettes, etc.
+ # Do not source the check-cran.sh - it should be run from where it is for it to set SPARK_HOME
+ NO_TESTS=1 CLEAN_INSTALL=1 "$SPARK_HOME/"R/check-cran.sh
+ popd > /dev/null
else
- echo "Skipping creating pip installable PySpark"
+ echo "Skipping building R source package"
fi
# Copy other things
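With these changes, a distribution that bundles the R source package can be produced by passing the new `--r` flag to make-distribution.sh; for example (an illustrative invocation, with the name and Maven profiles chosen to mirror those used in release-build.sh above):

```sh
./dev/make-distribution.sh --name custom-spark --tgz --r \
  -Psparkr -Phadoop-2.7 -Pyarn -Pmesos
```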