author    Patrick Wendell <pwendell@gmail.com>  2014-03-09 11:57:06 -0700
committer Patrick Wendell <pwendell@gmail.com>  2014-03-09 11:57:06 -0700
commit    faf4cad1debb76148facc008e0a3308ac96eee7a (patch)
tree      785e068b36a58c7e532a87ed6c8e27f7c4059a53
parent    f6f9d02e85d17da2f742ed0062f1648a9293e73c (diff)
Fix markup errors introduced in #33 (SPARK-1189)
These were causing errors on the configuration page.
Author: Patrick Wendell <pwendell@gmail.com>
Closes #111 from pwendell/master and squashes the following commits:
8467a86 [Patrick Wendell] Fix markup errors introduced in #33 (SPARK-1189)
docs/configuration.md | 6 +++---
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/docs/configuration.md b/docs/configuration.md
index 8f6cb02911..a006224d50 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -147,13 +147,13 @@ Apart from these, the following properties are also available, and may be useful
     How many stages the Spark UI remembers before garbage collecting.
   </td>
 </tr>
-</tr>
+<tr>
   <td>spark.ui.filters</td>
   <td>None</td>
   <td>
     Comma separated list of filter class names to apply to the Spark web ui. The filter should be a
     standard javax servlet Filter. Parameters to each filter can also be specified by setting a
-    java system property of spark.<class name of filter>.params='param1=value1,param2=value2'
+    java system property of spark.&lt;class name of filter&gt;.params='param1=value1,param2=value2'
     (e.g.-Dspark.ui.filters=com.test.filter1 -Dspark.com.test.filter1.params='param1=foo,param2=testing')
   </td>
 </tr>
@@ -515,7 +515,7 @@ Apart from these, the following properties are also available, and may be useful
     the whole cluster by default. <br/>
     <b>Note:</b> this setting needs to be configured in the standalone cluster master, not in individual
     applications; you can set it through <code>SPARK_JAVA_OPTS</code> in <code>spark-env.sh</code>.
-</td>
+  </td>
 </tr>
 <tr>
   <td>spark.files.overwrite</td>
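The spark.ui.filters setting described in the patched docs can be sketched in spark-env.sh as below. This is an illustrative fragment only; `com.test.filter1` and its parameters are the placeholder names from the doc's own example, not a real filter class.

```shell
# spark-env.sh -- sketch of the web UI filter config the docs describe.
# com.test.filter1 is a hypothetical javax servlet Filter class name.
# Each filter's parameters go in a matching spark.<filter class>.params
# java system property, as 'param1=value1,param2=value2' pairs.
SPARK_JAVA_OPTS="-Dspark.ui.filters=com.test.filter1"
SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Dspark.com.test.filter1.params='param1=foo,param2=testing'"
export SPARK_JAVA_OPTS
```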