| author | Matei Zaharia <matei@eecs.berkeley.edu> | 2013-07-16 17:30:15 -0700 |
|---|---|---|
| committer | Matei Zaharia <matei@eecs.berkeley.edu> | 2013-07-16 17:30:15 -0700 |
| commit | 87d586e4da63e6e1875d9cac194c6f11e1cdc653 (patch) | |
| tree | 3b2ed3203b6abb94a3a853a7b95dfebaa67665a1 /docs/tuning.md | |
| parent | d733527bb4dad14b276b4f56b1ff5c3ee1cb7f75 (diff) | |
| parent | 4ff494de20c36151dc29a60825d67e094d14acd4 (diff) | |
Merge remote-tracking branch 'old/master'
Diffstat (limited to 'docs/tuning.md')
| -rw-r--r-- | docs/tuning.md | 6 |
|---|---|---|

1 file changed, 3 insertions, 3 deletions
```diff
diff --git a/docs/tuning.md b/docs/tuning.md
index 32c7ab86e9..5ffca54481 100644
--- a/docs/tuning.md
+++ b/docs/tuning.md
@@ -157,9 +157,9 @@ their work directories), *not* on your driver program.
 
 **Cache Size Tuning**
 
-One important configuration parameter for GC is the amount of memory that should be used for
-caching RDDs. By default, Spark uses 66% of the configured memory (`SPARK_MEM`) to cache RDDs. This means that
-33% of memory is available for any objects created during task execution.
+One important configuration parameter for GC is the amount of memory that should be used for caching RDDs.
+By default, Spark uses 66% of the configured executor memory (`spark.executor.memory` or `SPARK_MEM`) to
+cache RDDs. This means that 33% of memory is available for any objects created during task execution.
 
 In case your tasks slow down and you find that your JVM is garbage-collecting frequently or running out of
 memory, lowering this value will help reduce the memory consumption. To change this to say 50%, you can call
```
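The patched paragraph describes a fixed split of executor memory: 66% for cached RDDs, 33% for objects created during task execution. A minimal sketch of that arithmetic (the `memory_split` helper and the 4 GB figure are hypothetical, for illustration only; they are not part of Spark's API):

```python
def memory_split(executor_mem_mb, storage_fraction=0.66):
    """Split executor memory into cache vs. task-execution portions.

    storage_fraction mirrors the 66% default described in the doc;
    lowering it (e.g. to 0.5) leaves more room for task objects,
    which can reduce GC pressure.
    """
    cache_mb = executor_mem_mb * storage_fraction
    task_mb = executor_mem_mb - cache_mb
    return cache_mb, task_mb

# e.g. a hypothetical 4 GB executor (SPARK_MEM=4g)
cache_mb, task_mb = memory_split(4096)          # default 66/34 split
cache_half, task_half = memory_split(4096, 0.5) # after lowering to 50%
```

With the default fraction, roughly 2.7 GB of a 4 GB executor would hold cached RDDs; dropping the fraction to 0.5 shifts about 650 MB back to task execution.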