From 45d03231d0961677ea0372d36977cecf21ab62d0 Mon Sep 17 00:00:00 2001
From: Andy Konwinski
Date: Mon, 8 Oct 2012 10:13:26 -0700
Subject: Adds liquid variables to docs templating system so that they can be
 used throughout the docs: SPARK_VERSION, SCALA_VERSION, and MESOS_VERSION.

To use them, e.g. use {{site.SPARK_VERSION}}. Also removes uses of
{{HOME_PATH}} which were being resolved to "" by the templating system
anyway.
---
 docs/contributing-to-spark.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

(limited to 'docs/contributing-to-spark.md')

diff --git a/docs/contributing-to-spark.md b/docs/contributing-to-spark.md
index a0e645d6cc..c6e01c62d8 100644
--- a/docs/contributing-to-spark.md
+++ b/docs/contributing-to-spark.md
@@ -12,7 +12,7 @@ The Spark team welcomes contributions in the form of GitHub pull requests. Here
   * Always import packages using absolute paths (e.g. `scala.collection.Map` instead of `collection.Map`).
   * No "infix" syntax for methods other than operators. For example, don't write `table containsKey myKey`; replace it with `table.containsKey(myKey)`.
 - Make sure that your code passes the unit tests. You can run the tests with `sbt/sbt test` in the root directory of Spark.
-  But first, make sure that you have [configured a spark-env.sh]({{HOME_PATH}}configuration.html) with at least
+  But first, make sure that you have [configured a spark-env.sh](configuration.html) with at least
   `SCALA_HOME`, as some of the tests try to spawn subprocesses using this.
 - Add new unit tests for your code. We use [ScalaTest](http://www.scalatest.org/) for testing. Just add a new Suite in `core/src/test`, or methods to an existing Suite.
 - If you'd like to report a bug but don't have time to fix it, you can still post it to our [issues page](https://github.com/mesos/spark/issues), or email the [mailing list](http://www.spark-project.org/mailing-lists.html).
--
cgit v1.2.3