From bf5699543bf69fc850dbc2676caac97fa27818da Mon Sep 17 00:00:00 2001
From: Patrick Wendell
Date: Sat, 18 Jan 2014 16:17:34 -0800
Subject: Merge pull request #462 from mateiz/conf-file-fix

Remove Typesafe Config usage and conf files to fix nested property names

With Typesafe Config we had the subtle problem of no longer allowing
nested property names, which are used for a few of our properties:
http://apache-spark-developers-list.1001551.n3.nabble.com/Config-properties-broken-in-master-td208.html

This PR is for branch 0.9 but should be added into master too.

(cherry picked from commit 34e911ce9a9f91f3259189861779032069257852)
Signed-off-by: Patrick Wendell
---
 docs/configuration.md | 28 ++--------------------------
 1 file changed, 2 insertions(+), 26 deletions(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index da70cabba2..00864906b3 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -18,8 +18,8 @@ Spark provides three locations to configure the system:
 Spark properties control most application settings and are configured separately for each
 application. The preferred way to set them is by passing a [SparkConf](api/core/index.html#org.apache.spark.SparkConf)
 class to your SparkContext constructor.
-Alternatively, Spark will also load them from Java system properties (for compatibility with old versions
-of Spark) and from a [`spark.conf` file](#configuration-files) on your classpath.
+Alternatively, Spark will also load them from Java system properties, for compatibility with old versions
+of Spark.
 
 SparkConf lets you configure most of the common properties to initialize a cluster (e.g., master URL and
 application name), as well as arbitrary key-value pairs through the `set()` method. For example, we could
@@ -468,30 +468,6 @@ Apart from these, the following properties are also available, and may be useful
 The application web UI at `http://<driver>:4040` lists Spark properties in the "Environment" tab.
 This is a useful place to check to make sure that your properties have been set correctly.
 
-## Configuration Files
-
-You can also configure Spark properties through a `spark.conf` file on your Java classpath.
-Because these properties are usually application-specific, we recommend putting this fine *only* on your
-application's classpath, and not in a global Spark classpath.
-
-The `spark.conf` file uses Typesafe Config's [HOCON format](https://github.com/typesafehub/config#json-superset),
-which is a superset of Java properties files and JSON. For example, the following is a simple config file:
-
-{% highlight awk %}
-# Comments are allowed
-spark.executor.memory = 512m
-spark.serializer = org.apache.spark.serializer.KryoSerializer
-{% endhighlight %}
-
-The format also allows hierarchical nesting, as follows:
-
-{% highlight awk %}
-spark.akka {
-  threads = 8
-  timeout = 200
-}
-{% endhighlight %}
-
 # Environment Variables
 
 Certain Spark settings can be configured through environment variables, which are read from the `conf/spark-env.sh`
-- 
cgit v1.2.3
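
Note: the sketch below is illustrative and not part of the patch. It shows the SparkConf-based configuration the commit message points to as the preferred mechanism, including the kind of nested (dotted) property names that Typesafe Config broke; the master URL, application name, and property values are placeholder assumptions.

{% highlight scala %}
// Illustrative sketch (not part of the patch): setting Spark properties
// through SparkConf, as the 0.9 docs recommend.
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[2]")        // placeholder master URL
  .setAppName("ConfExample")    // placeholder application name
  // Nested (dotted) property names like these are exactly what this
  // change makes work again:
  .set("spark.akka.threads", "8")
  .set("spark.executor.memory", "512m")

val sc = new SparkContext(conf)
{% endhighlight %}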