path: root/docs/configuration.md
author    Ilya Ganelin <ilya.ganelin@capitalone.com>  2015-07-01 23:11:02 -0700
committer Andrew Or <andrew@databricks.com>  2015-07-01 23:11:02 -0700
commit    3697232b7d438979cc119b2a364296b0eec4a16a (patch)
tree      178437d6c6d5ffac1560ce96d1d89f3ca0ba805e /docs/configuration.md
parent    377ff4c9e8942882183d94698684824e9dc9f391 (diff)
[SPARK-3071] Increase default driver memory
I've updated default values in comments, documentation, and in the command line builder to be 1g based on comments in the JIRA. I've also updated most usages to point at a single variable defined in the Utils.scala and JavaUtils.java files. This wasn't possible in all cases (R, shell scripts, etc.), but usage in most code now points at the same place. Please let me know if I've missed anything. Will the spark-shell use the value within the command line builder during instantiation?

Author: Ilya Ganelin <ilya.ganelin@capitalone.com>

Closes #7132 from ilganeli/SPARK-3071 and squashes the following commits:

4074164 [Ilya Ganelin] String fix
271610b [Ilya Ganelin] Merge branch 'SPARK-3071' of github.com:ilganeli/spark into SPARK-3071
273b6e9 [Ilya Ganelin] Test fix
fd67721 [Ilya Ganelin] Update JavaUtils.java
26cc177 [Ilya Ganelin] test fix
e5db35d [Ilya Ganelin] Fixed test failure
39732a1 [Ilya Ganelin] merge fix
a6f7deb [Ilya Ganelin] Created default value for DRIVER MEM in Utils that's now used in almost all locations instead of setting manually in each
09ad698 [Ilya Ganelin] Update SubmitRestProtocolSuite.scala
19b6f25 [Ilya Ganelin] Missed one doc update
2698a3d [Ilya Ganelin] Updated default value for driver memory
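As the updated documentation notes, in client mode `spark.driver.memory` cannot be set through `SparkConf` because the driver JVM is already running by then; it is instead set at launch time. A minimal sketch of the two usual ways to override the new 1g default (the class name and jar path here are hypothetical, for illustration only):

```shell
# Override the driver-memory default (now 1g) at submit time.
# com.example.MyApp and my-app.jar are placeholder names.
spark-submit \
  --class com.example.MyApp \
  --driver-memory 2g \
  my-app.jar

# Equivalent persistent setting in conf/spark-defaults.conf:
# spark.driver.memory  2g
```

Applications that previously relied on the old 512m default and need it back can pass `--driver-memory 512m` explicitly.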
Diffstat (limited to 'docs/configuration.md')
-rw-r--r--  docs/configuration.md  4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index affcd21514..bebaf6f62e 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -137,10 +137,10 @@ of the most common options to set are:
</tr>
<tr>
<td><code>spark.driver.memory</code></td>
- <td>512m</td>
+ <td>1g</td>
<td>
Amount of memory to use for the driver process, i.e. where SparkContext is initialized.
- (e.g. <code>512m</code>, <code>2g</code>).
+ (e.g. <code>1g</code>, <code>2g</code>).
<br /><em>Note:</em> In client mode, this config must not be set through the <code>SparkConf</code>
directly in your application, because the driver JVM has already started at that point.