authorMichael Sannella x268 <msannell@tibco.com>2015-06-29 17:28:28 -0700
committerAndrew Or <andrew@databricks.com>2015-06-29 17:28:28 -0700
commit4a9e03fa850af9e4ee56d011671faa04fb601170 (patch)
tree6e3443f4a1cfcb5f4e2b687248bf80a4400ed7b6 /core
parentd7f796da45d9a7c76ee4c29a9e0661ef76d8028a (diff)
[SPARK-8019] [SPARKR] Support SparkR spawning worker R processes with a command other than Rscript
This is a simple change that adds a new configuration property, "spark.sparkr.r.command", specifying the command that SparkR will use when creating an R worker process. If it is not specified, "Rscript" is used by default.

I did not add any documentation, since I couldn't find any place where such properties (for example "spark.sparkr.use.daemon") are documented. I also did not add a unit test: the only test that would work in general would be one starting SparkR with sparkR.init(sparkEnvir=list(spark.sparkr.r.command="Rscript")), which just exercises the default value. I think that this is a low-risk change.

Likely committers: shivaram

Author: Michael Sannella x268 <msannell@tibco.com>

Closes #6557 from msannell/altR and squashes the following commits:

7eac142 [Michael Sannella x268] add spark.sparkr.r.command config parameter
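The lookup pattern the patch introduces can be sketched without a running Spark cluster. In this sketch a plain `Map` stands in for `SparkConf` (an assumption for illustration only; the actual patch calls `SparkEnv.get.conf.get("spark.sparkr.r.command", "Rscript")`):

```scala
object RCommandLookup {
  // Resolve the R command: use the configured value if present,
  // otherwise fall back to the previous hard-coded default "Rscript".
  // `conf` is a hypothetical stand-in for Spark's SparkConf.
  def rCommand(conf: Map[String, String]): String =
    conf.getOrElse("spark.sparkr.r.command", "Rscript")

  def main(args: Array[String]): Unit = {
    // No property set: falls back to the default.
    println(rCommand(Map.empty))
    // Property set: a custom R interpreter path is returned instead.
    println(rCommand(Map("spark.sparkr.r.command" -> "/opt/R/bin/Rscript")))
  }
}
```

This mirrors the one-line change in `createRProcess`: the command is no longer hard-coded, but the default behavior is unchanged when the property is absent.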
Diffstat (limited to 'core')
-rw-r--r--core/src/main/scala/org/apache/spark/api/r/RRDD.scala | 2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/api/r/RRDD.scala b/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
index 4dfa732593..524676544d 100644
--- a/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
+++ b/core/src/main/scala/org/apache/spark/api/r/RRDD.scala
@@ -391,7 +391,7 @@ private[r] object RRDD {
}
private def createRProcess(rLibDir: String, port: Int, script: String): BufferedStreamThread = {
- val rCommand = "Rscript"
+ val rCommand = SparkEnv.get.conf.get("spark.sparkr.r.command", "Rscript")
val rOptions = "--vanilla"
val rExecScript = rLibDir + "/SparkR/worker/" + script
val pb = new ProcessBuilder(List(rCommand, rOptions, rExecScript))