author	Patrick Wendell <pwendell@gmail.com>	2014-01-18 16:17:34 -0800
committer	Patrick Wendell <pwendell@gmail.com>	2014-01-18 16:20:00 -0800
commit	bf5699543bf69fc850dbc2676caac97fa27818da (patch)
tree	6a67ad6a1977c164b0f22d4206bb2248e073df18 /docs
parent	aa981e4e97a11dbd5a4d012bfbdb395982968372 (diff)
Merge pull request #462 from mateiz/conf-file-fix
Remove Typesafe Config usage and conf files to fix nested property names

With Typesafe Config we had the subtle problem of no longer allowing nested property names, which are used for a few of our properties: http://apache-spark-developers-list.1001551.n3.nabble.com/Config-properties-broken-in-master-td208.html

This PR is for branch 0.9 but should be added into master too.

(cherry picked from commit 34e911ce9a9f91f3259189861779032069257852)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
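For context, the conflict arises because HOCON treats dotted keys as nested objects, so a key cannot hold both a value and children. A minimal sketch of the kind of property pair at issue (these are real Spark property names, but the snippet is illustrative and not part of the patch):

{% highlight scala %}
import org.apache.spark.SparkConf

// "spark.speculation" holds a value, while "spark.speculation.interval"
// nests a child under the same path. Flat Java properties allow this;
// HOCON parses dotted names into an object tree, where a node cannot be
// both a leaf value and a parent -- the breakage this commit fixes by
// dropping Typesafe Config.
val conf = new SparkConf()
  .set("spark.speculation", "true")
  .set("spark.speculation.interval", "500")
{% endhighlight %}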
Diffstat (limited to 'docs')
-rw-r--r--	docs/configuration.md	28
1 file changed, 2 insertions(+), 26 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index da70cabba2..00864906b3 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -18,8 +18,8 @@ Spark provides three locations to configure the system:
Spark properties control most application settings and are configured separately for each application.
The preferred way to set them is by passing a [SparkConf](api/core/index.html#org.apache.spark.SparkConf)
class to your SparkContext constructor.
-Alternatively, Spark will also load them from Java system properties (for compatibility with old versions
-of Spark) and from a [`spark.conf` file](#configuration-files) on your classpath.
+Alternatively, Spark will also load them from Java system properties, for compatibility with old versions
+of Spark.
SparkConf lets you configure most of the common properties to initialize a cluster (e.g., master URL and
application name), as well as arbitrary key-value pairs through the `set()` method. For example, we could
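
The hunk cuts this sentence off at "For example, we could"; a minimal sketch of the kind of initialization it refers to (the master URL, application name, and memory value here are placeholders, not taken from the patch):

{% highlight scala %}
import org.apache.spark.{SparkConf, SparkContext}

// Build a SparkConf with common properties, plus an arbitrary
// key-value pair via set(), then pass it to the SparkContext.
val conf = new SparkConf()
  .setMaster("local")
  .setAppName("My application")
  .set("spark.executor.memory", "1g")
val sc = new SparkContext(conf)
{% endhighlight %}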
@@ -468,30 +468,6 @@ Apart from these, the following properties are also available, and may be useful
The application web UI at `http://<driver>:4040` lists Spark properties in the "Environment" tab.
This is a useful place to check to make sure that your properties have been set correctly.
-## Configuration Files
-
-You can also configure Spark properties through a `spark.conf` file on your Java classpath.
-Because these properties are usually application-specific, we recommend putting this fine *only* on your
-application's classpath, and not in a global Spark classpath.
-
-The `spark.conf` file uses Typesafe Config's [HOCON format](https://github.com/typesafehub/config#json-superset),
-which is a superset of Java properties files and JSON. For example, the following is a simple config file:
-
-{% highlight awk %}
-# Comments are allowed
-spark.executor.memory = 512m
-spark.serializer = org.apache.spark.serializer.KryoSerializer
-{% endhighlight %}
-
-The format also allows hierarchical nesting, as follows:
-
-{% highlight awk %}
-spark.akka {
- threads = 8
- timeout = 200
-}
-{% endhighlight %}
-
# Environment Variables
Certain Spark settings can be configured through environment variables, which are read from the `conf/spark-env.sh`