From 153cad1293efb7947f5c3d01c7209b5b035e63c6 Mon Sep 17 00:00:00 2001
From: Patrick Wendell
Date: Tue, 10 Dec 2013 12:53:45 -0800
Subject: README incorrectly suggests build sources spark-env.sh

This is misleading because the build doesn't source that file. IMO
it's better to force people to specify build environment variables
on the command line always, like we do in every example.
---
 README.md | 3 ---
 1 file changed, 3 deletions(-)

diff --git a/README.md b/README.md
index 8c7853ea3d..7faba27420 100644
--- a/README.md
+++ b/README.md
@@ -69,9 +69,6 @@ When building for Hadoop 2.2.X and newer, you'll need to include the additional
     # Apache Hadoop 2.2.X and newer
     $ mvn -Dyarn.version=2.2.0 -Dhadoop.version=2.2.0 -Pnew-yarn
 
-For convenience, these variables may also be set through the `conf/spark-env.sh` file
-described below.
-
 When developing a Spark application, specify the Hadoop version by adding the
 "hadoop-client" artifact to your project's dependencies. For example, if you're
 using Hadoop 1.2.1 and build your application using SBT, add this entry to
--
cgit v1.2.3
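
Note on the commit's rationale: after this change the Hadoop/YARN versions are meant to be passed directly on the build command line rather than set in `conf/spark-env.sh`. A minimal illustrative invocation is sketched below; only the `-D`/`-P` flags come from the README snippet in the diff, and the `-DskipTests clean package` goals are an assumption, not part of this patch.

    # Illustrative only: version flags supplied on the command line at build time
    # (Maven goals here are assumed, not taken from the patch)
    $ mvn -Dyarn.version=2.2.0 -Dhadoop.version=2.2.0 -Pnew-yarn -DskipTests clean package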