author:    Patrick Wendell <pwendell@gmail.com>  2014-01-18 16:17:34 -0800
committer: Patrick Wendell <pwendell@gmail.com>  2014-01-18 16:20:00 -0800
commit:    bf5699543bf69fc850dbc2676caac97fa27818da (patch)
tree:      6a67ad6a1977c164b0f22d4206bb2248e073df18 /python
parent:    aa981e4e97a11dbd5a4d012bfbdb395982968372 (diff)
Merge pull request #462 from mateiz/conf-file-fix
Remove Typesafe Config usage and conf files to fix nested property names

With Typesafe Config we had the subtle problem of no longer allowing nested property names, which are used for a few of our properties: http://apache-spark-developers-list.1001551.n3.nabble.com/Config-properties-broken-in-master-td208.html

This PR is for branch 0.9 but should be added into master too.

(cherry picked from commit 34e911ce9a9f91f3259189861779032069257852)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
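As a rough sketch of the nested-name problem this commit fixes (the property names below are real Spark settings, but the snippet itself is illustrative and is not code from this PR): Spark uses keys such as spark.speculation alongside spark.speculation.interval, so one key must serve both as a leaf value and as a prefix of other keys. A flat properties store accepts that, while a tree-structured library like Typesafe Config would treat spark.speculation as a parent node and cannot also give it a scalar value.

    from pyspark import SparkConf

    # Illustrative only: "spark.speculation" is both a value and a
    # prefix of "spark.speculation.interval". SparkConf stores keys
    # flat, so this works; a tree-structured config format cannot
    # represent the same key as both a scalar and a parent node.
    conf = SparkConf()
    conf.set("spark.speculation", "true")
    conf.set("spark.speculation.interval", "1000")
    print(conf.toDebugString())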
Diffstat (limited to 'python')
-rw-r--r--  python/pyspark/conf.py | 10
1 file changed, 4 insertions(+), 6 deletions(-)
diff --git a/python/pyspark/conf.py b/python/pyspark/conf.py
index d72aed6a30..3870cd8f2b 100644
--- a/python/pyspark/conf.py
+++ b/python/pyspark/conf.py
@@ -61,14 +61,12 @@ class SparkConf(object):
Most of the time, you would create a SparkConf object with
C{SparkConf()}, which will load values from C{spark.*} Java system
- properties and any C{spark.conf} on your Spark classpath. In this
- case, system properties take priority over C{spark.conf}, and any
- parameters you set directly on the C{SparkConf} object take priority
- over both of those.
+ properties as well. In this case, any parameters you set directly on
+ the C{SparkConf} object take priority over system properties.
For unit tests, you can also call C{SparkConf(false)} to skip
loading external settings and get the same configuration no matter
- what is on the classpath.
+ what the system properties are.
All setter methods in this class support chaining. For example,
you can write C{conf.setMaster("local").setAppName("My app")}.
@@ -82,7 +80,7 @@ class SparkConf(object):
Create a new Spark configuration.
@param loadDefaults: whether to load values from Java system
- properties and classpath (True by default)
+ properties (True by default)
@param _jvm: internal parameter used to pass a handle to the
Java VM; does not need to be set by users
"""