author:    Aaron Davidson <aaron@databricks.com>  2014-03-09 11:08:39 -0700
committer: Aaron Davidson <aaron@databricks.com>  2014-03-09 11:08:39 -0700
commit:    52834d761b059264214dfc6a1f9c70b8bc7ec089 (patch)
tree:      deadb9fd8330b40da0b455478c9319dd75421f58 /docs
parent:    e59a3b6c415b95e8137f5a154716b12653a8aed0 (diff)
download:  spark-52834d761b059264214dfc6a1f9c70b8bc7ec089.tar.gz, spark-52834d761b059264214dfc6a1f9c70b8bc7ec089.tar.bz2, spark-52834d761b059264214dfc6a1f9c70b8bc7ec089.zip
SPARK-929: Fully deprecate usage of SPARK_MEM
(Continued from old repo, prior discussion at https://github.com/apache/incubator-spark/pull/615)
This patch cements our deprecation of the SPARK_MEM environment variable by replacing it with three more specialized variables:
SPARK_DAEMON_MEMORY, SPARK_EXECUTOR_MEMORY, and SPARK_DRIVER_MEMORY
The creation of the latter two variables means that we can safely set driver/job memory without accidentally setting the executor memory. Neither variable is public.
SPARK_EXECUTOR_MEMORY is only used by the Mesos scheduler (and set within SparkContext). The proper way of configuring executor memory is through the "spark.executor.memory" property.
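As a sketch of the property-based route (the `2g` value is hypothetical): `spark.executor.memory` is a Java system property, so it ultimately reaches the JVM as a `-D` flag, while `SPARK_EXECUTOR_MEMORY` stays internal to the Mesos scheduler path.

```shell
# Hypothetical illustration: executor memory configured as a system property
# rather than via the deprecated SPARK_MEM environment variable.
EXECUTOR_MEM="2g"
JAVA_OPTS="-Dspark.executor.memory=$EXECUTOR_MEM"
echo "$JAVA_OPTS"
```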
SPARK_DRIVER_MEMORY is the new way of specifying the amount of memory used by jobs launched via spark-class, without possibly affecting executor memory.
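The resolution order this implies can be sketched as follows. This is not the actual spark-class code; the function name, default of `512m`, and fallback chain are illustrative assumptions only:

```shell
# Sketch of how a launcher might resolve driver memory:
# SPARK_DRIVER_MEMORY wins, then the deprecated SPARK_MEM, then a default.
# resolve_mem <driver_memory> <legacy_spark_mem>
resolve_mem() {
  echo "${1:-${2:-512m}}"
}

resolve_mem "1g" ""    # new variable set: prints 1g
resolve_mem "" "2g"    # only legacy variable set: prints 2g
resolve_mem "" ""      # nothing set: prints the 512m default
```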
Other memory considerations:
- The repl's memory can be set through the "--drivermem" command-line option, which really just sets SPARK_DRIVER_MEMORY.
- run-example doesn't use spark-class, so the only way to modify examples' memory is actually an unusual use of SPARK_JAVA_OPTS (which is normally overridden in all cases by spark-class).
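For the run-example case above, that unusual use might look like the following fragment (the `-Xmx1g` value is a hypothetical choice, not a recommendation from the patch):

```shell
# Hypothetical: heap options for examples. Because run-example bypasses
# spark-class, SPARK_JAVA_OPTS is not overridden and takes effect here.
export SPARK_JAVA_OPTS="-Xmx1g"
```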
This patch also fixes a lurking bug where spark-shell misused spark-class (the first argument is supposed to be the main class name, not java options), as well as a bug in the Windows spark-class2.cmd. I have not yet tested this patch on either Windows or Mesos, however.
Author: Aaron Davidson <aaron@databricks.com>
Closes #99 from aarondav/sparkmem and squashes the following commits:
9df4c68 [Aaron Davidson] SPARK-929: Fully deprecate usage of SPARK_MEM
Diffstat (limited to 'docs')
 docs/tuning.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
```diff
diff --git a/docs/tuning.md b/docs/tuning.md
index 26ff1325bb..093df3187a 100644
--- a/docs/tuning.md
+++ b/docs/tuning.md
@@ -163,7 +163,7 @@ their work directories), *not* on your driver program.
 **Cache Size Tuning**
 
 One important configuration parameter for GC is the amount of memory that should be used for caching RDDs.
-By default, Spark uses 60% of the configured executor memory (`spark.executor.memory` or `SPARK_MEM`) to
+By default, Spark uses 60% of the configured executor memory (`spark.executor.memory`) to
 cache RDDs. This means that 40% of memory is available for any objects created during task execution.
 In case your tasks slow down and you find that your JVM is garbage-collecting frequently or running out of
```