| author | 郭小龙 10207633 <guo.xiaolong1@zte.com.cn> | 2017-04-21 20:08:26 +0100 |
|---|---|---|
| committer | Sean Owen <sowen@cloudera.com> | 2017-04-21 20:08:26 +0100 |
| commit | ad290402aa1d609abf5a2883a6d87fa8bc2bd517 (patch) | |
| tree | 5f0e7d0ba7dea37827755221b5a0798fb118adc8 | |
| parent | fd648bff63f91a30810910dfc5664eea0ff5e6f9 (diff) | |
[SPARK-20401][DOC] In the Spark official configuration document, the 'spark.driver.supervise' configuration parameter specification and default value are necessary.
## What changes were proposed in this pull request?
Submit a Spark job through the REST interface, e.g.:

```shell
curl -X POST http://10.43.183.120:6066/v1/submissions/create \
  --header "Content-Type:application/json;charset=UTF-8" \
  --data '{
  "action": "CreateSubmissionRequest",
  "appArgs": [
    "myAppArgument"
  ],
  "appResource": "/home/mr/gxl/test.jar",
  "clientSparkVersion": "2.2.0",
  "environmentVariables": {
    "SPARK_ENV_LOADED": "1"
  },
  "mainClass": "cn.zte.HdfsTest",
  "sparkProperties": {
    "spark.jars": "/home/mr/gxl/test.jar",
    "spark.driver.supervise": "true",
    "spark.app.name": "HdfsTest",
    "spark.eventLog.enabled": "false",
    "spark.submit.deployMode": "cluster",
    "spark.master": "spark://10.43.183.120:6066"
  }
}'
```
**The goal is to make sure that the driver is restarted automatically if it fails with a non-zero exit code. However, I could not find the 'spark.driver.supervise' configuration parameter specification and default value in the Spark official documentation.**
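The request body above can also be assembled programmatically. The sketch below uses a hypothetical helper, `build_submission` (not a Spark API), to build the same `CreateSubmissionRequest` payload shown in the curl example, with `spark.driver.supervise` set explicitly:

```python
import json

# Sketch only: build_submission is a hypothetical helper, not part of Spark.
# It assembles the CreateSubmissionRequest body for the standalone Master's
# REST submission endpoint, mirroring the curl example above.
def build_submission(app_resource, main_class, supervise=True):
    return {
        "action": "CreateSubmissionRequest",
        "appArgs": ["myAppArgument"],
        "appResource": app_resource,
        "clientSparkVersion": "2.2.0",
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "mainClass": main_class,
        "sparkProperties": {
            "spark.jars": app_resource,
            # spark.driver.supervise defaults to false; "true" asks the
            # standalone Master to restart the driver on non-zero exit.
            "spark.driver.supervise": "true" if supervise else "false",
            "spark.app.name": "HdfsTest",
            "spark.eventLog.enabled": "false",
            "spark.submit.deployMode": "cluster",
            "spark.master": "spark://10.43.183.120:6066",
        },
    }

payload = build_submission("/home/mr/gxl/test.jar", "cn.zte.HdfsTest")
print(json.dumps(payload, indent=2))
```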
## How was this patch tested?
manual tests
Please review http://spark.apache.org/contributing.html before opening a pull request.
Author: 郭小龙 10207633 <guo.xiaolong1@zte.com.cn>
Author: guoxiaolong <guo.xiaolong1@zte.com.cn>
Author: guoxiaolongzte <guo.xiaolong1@zte.com.cn>
Closes #17696 from guoxiaolongzte/SPARK-20401.
| -rw-r--r-- | docs/configuration.md | 8 |

1 file changed, 8 insertions, 0 deletions
```diff
diff --git a/docs/configuration.md b/docs/configuration.md
index 2687f542b8..6b65d2bcb8 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -213,6 +213,14 @@ of the most common options to set are:
   and typically can have up to 50 characters.
   </td>
 </tr>
+<tr>
+  <td><code>spark.driver.supervise</code></td>
+  <td>false</td>
+  <td>
+    If true, restarts the driver automatically if it fails with a non-zero exit status.
+    Only has effect in Spark standalone mode or Mesos cluster deploy mode.
+  </td>
+</tr>
 </table>

 Apart from these, the following properties are also available, and may be useful in some situations:
```
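For reference, the property documented in this patch is typically supplied at submission time. A sketch of the equivalent `spark-submit` invocation (paths, class name, and master URL are placeholders taken from the example above, not a definitive command):

```shell
# Standalone cluster deploy mode: spark.driver.supervise defaults to false;
# setting it to true asks the Master to restart the driver on failure.
spark-submit \
  --master spark://10.43.183.120:6066 \
  --deploy-mode cluster \
  --conf spark.driver.supervise=true \
  --class cn.zte.HdfsTest \
  /home/mr/gxl/test.jar myAppArgument
```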