commit 41dc5987d9abeca6fc0f5935c780d48f517cdf95
tree ba7ec3f43526910f197f090664e7e496e4755fda /docs
parent 6d392b36ee1dc6f7e5198dd436e4e62eb816a072
author Andrew Or <andrewor14@gmail.com> 2014-08-28 11:05:44 -0700
committer Andrew Or <andrewor14@gmail.com> 2014-08-28 11:05:44 -0700
[SPARK-3264] Allow users to set executor Spark home in Mesos
The executors and the driver may not share the same Spark home. Currently, the only way to set the executor-side Spark home in Mesos is by setting `spark.home`, which is neither documented nor intuitive. This PR adds a more specific config, `spark.mesos.executor.home`, and exposes it to the user.
liancheng tnachen
Author: Andrew Or <andrewor14@gmail.com>
Closes #2166 from andrewor14/mesos-spark-home and squashes the following commits:
b87965e [Andrew Or] Merge branch 'master' of github.com:apache/spark into mesos-spark-home
f6abb2e [Andrew Or] Document spark.mesos.executor.home
ca7846d [Andrew Or] Add more specific configuration for executor Spark home in Mesos
Diffstat (limited to 'docs')

 docs/configuration.md | 10 ++++++++++
 1 file changed, 10 insertions(+), 0 deletions(-)
```diff
diff --git a/docs/configuration.md b/docs/configuration.md
index 981170d8b4..65a422caab 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -214,6 +214,16 @@ Apart from these, the following properties are also available, and may be useful
     process. The user can specify multiple of these and to set multiple environment variables.
   </td>
 </tr>
+<tr>
+  <td><code>spark.mesos.executor.home</code></td>
+  <td>driver side <code>SPARK_HOME</code></td>
+  <td>
+    Set the directory in which Spark is installed on the executors in Mesos. By default, the
+    executors will simply use the driver's Spark home directory, which may not be visible to
+    them. Note that this is only relevant if a Spark binary package is not specified through
+    <code>spark.executor.uri</code>.
+  </td>
+</tr>
 </table>

#### Shuffle Behavior
```
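As a usage sketch, the new property is set like any other Spark configuration, for example via `spark-submit`. The master URL, install path, class, and jar names below are placeholders, not values from this PR:

```shell
# Point Mesos executors at the Spark installation on the slave nodes
# (/opt/spark is a hypothetical path; adjust to your cluster layout).
# Only relevant when no binary package is given via spark.executor.uri.
spark-submit \
  --master mesos://zk://host:2181/mesos \
  --conf spark.mesos.executor.home=/opt/spark \
  --class org.example.MyApp \
  my-app.jar
```

The same key can instead be placed in `conf/spark-defaults.conf` on the driver if every job on the cluster should use it.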