From 4d3a17c8d768a4e76bfb895ce53715434447cb62 Mon Sep 17 00:00:00 2001
From: Andy Konwinski
Date: Wed, 12 Sep 2012 16:05:19 -0700
Subject: Fixing lots of broken links.

---
 docs/configuration.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/configuration.md b/docs/configuration.md
index 07190b2931..ab854de386 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -9,7 +9,7 @@ Spark is configured primarily through the `conf/spark-env.sh` script. This scrip
 Inside this script, you can set several environment variables:
 
 * `SCALA_HOME` to point to your Scala installation.
-* `MESOS_NATIVE_LIBRARY` if you are [[running on a Mesos cluster|Running Spark on Mesos]].
+* `MESOS_NATIVE_LIBRARY` if you are [running on a Mesos cluster]({{HOME_PATH}}running-on-mesos.html).
 * `SPARK_MEM` to set the amount of memory used per node (this should be in the same format as the JVM's -Xmx option, e.g. `300m` or `1g`)
 * `SPARK_JAVA_OPTS` to add JVM options. This includes system properties that you'd like to pass with `-D`.
 * `SPARK_CLASSPATH` to add elements to Spark's classpath.
@@ -21,4 +21,4 @@ The most important thing to set first will probably be the memory (`SPARK_MEM`).
 
 ## Logging Configuration
 
-Spark uses [[log4j|http://logging.apache.org/log4j/]] for logging. You can configure it by adding a `log4j.properties` file in the `conf` directory. One way to start is to copy the existing `log4j.properties.template` located there.
+Spark uses [log4j](http://logging.apache.org/log4j/) for logging. You can configure it by adding a `log4j.properties` file in the `conf` directory. One way to start is to copy the existing `log4j.properties.template` located there.
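For context on the documentation being patched, a minimal sketch of what the described `conf/spark-env.sh` might contain follows; every path and value below is a hypothetical placeholder, not something taken from the patch.

```sh
# Hypothetical conf/spark-env.sh sketch; values are illustrative placeholders.
export SCALA_HOME=/usr/local/scala                       # your Scala installation
export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so   # only if running on a Mesos cluster
export SPARK_MEM=1g                                      # per-node memory, same format as the JVM's -Xmx
export SPARK_JAVA_OPTS="-Dspark.local.dir=/tmp/spark"    # extra JVM options, including -D system properties
export SPARK_CLASSPATH=/opt/libs/extra.jar               # extra entries for Spark's classpath
```

The logging setup described in the second hunk amounts to copying the bundled template, e.g. `cp conf/log4j.properties.template conf/log4j.properties`, and then editing the copy.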