| author | bomeng <bmeng@us.ibm.com> | 2016-06-12 12:58:34 +0100 |
|---|---|---|
| committer | Sean Owen <sowen@cloudera.com> | 2016-06-12 12:58:34 +0100 |
| commit | 3fd3ee038b89821f51f30a4ecd4452b5b3bc6568 (patch) | |
| tree | 06e5851175b71fa35f22db6376322a33530c1545 | |
| parent | 8cc22b0085475a188f229536b4f83988ae889a8e (diff) | |
[SPARK-15781][DOCUMENTATION] remove deprecated environment variable doc
## What changes were proposed in this pull request?
Like `SPARK_JAVA_OPTS` and `SPARK_CLASSPATH`, `SPARK_WORKER_INSTANCES` is deprecated, so this removes its documentation to discourage users from using it. If it is actually used, SparkConf will show a warning message, as before.
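For readers migrating away from the removed variable, a minimal `conf/spark-env.sh` sketch is shown below. It keeps a single worker per machine and bounds its resources with the variables that remain documented on the standalone page; the concrete values are illustrative assumptions, not recommendations from this patch:

```shell
# conf/spark-env.sh -- sketch only; adjust values to your machines.
# Instead of SPARK_WORKER_INSTANCES (deprecated), run one worker per
# machine and cap its resources explicitly:
SPARK_WORKER_CORES=8          # total cores this worker may give to executors (example value)
SPARK_WORKER_MEMORY=16g       # total memory this worker may give to executors (example value)
SPARK_WORKER_WEBUI_PORT=8081  # worker web UI port (the documented default)
```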
## How was this patch tested?
Manually tested.
Author: bomeng <bmeng@us.ibm.com>
Closes #13533 from bomeng/SPARK-15781.
-rw-r--r-- | docs/spark-standalone.md | 9 |
1 file changed, 0 insertions(+), 9 deletions(-)
```diff
diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index fd94c34d16..40c72931cb 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -134,15 +134,6 @@ You can optionally configure the cluster further by setting environment variable
     <td>Port for the worker web UI (default: 8081).</td>
   </tr>
   <tr>
-    <td><code>SPARK_WORKER_INSTANCES</code></td>
-    <td>
-      Number of worker instances to run on each machine (default: 1). You can make this more than 1 if
-      you have have very large machines and would like multiple Spark worker processes. If you do set
-      this, make sure to also set <code>SPARK_WORKER_CORES</code> explicitly to limit the cores per worker,
-      or else each worker will try to use all the cores.
-    </td>
-  </tr>
-  <tr>
     <td><code>SPARK_WORKER_DIR</code></td>
     <td>Directory to run applications in, which will include both logs and scratch space (default: SPARK_HOME/work).</td>
   </tr>
```