author    Aaron Davidson <aaron@databricks.com>  2013-10-08 11:41:52 -0700
committer Aaron Davidson <aaron@databricks.com>  2013-10-08 11:41:52 -0700
commit  749233b869da188920d8d72af7b82e586993d17c (patch)
tree    b2470cb5b7c9925b82940b080f35fd58bac7f1e0 /core
parent  1cd57cd4d34d5e0603c25a5ad7d28501fe9c94fd (diff)
Revert change to spark-class
Also adds comment about how to configure for FaultToleranceTest.
Diffstat (limited to 'core')
 core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala | 9 +++++++--
 1 file changed, 7 insertions(+), 2 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala b/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala
index 8bac62b860..668032a3a2 100644
--- a/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/FaultToleranceTest.scala
@@ -36,11 +36,16 @@ import org.apache.spark.deploy.master.RecoveryState
/**
* This suite tests the fault tolerance of the Spark standalone scheduler, mainly the Master.
+ * In order to mimic a real distributed cluster more closely, Docker is used.
* Execute using
* ./spark-class org.apache.spark.deploy.FaultToleranceTest
*
- * In order to mimic a real distributed cluster more closely, Docker is used.
- * Unfortunately, this dependency means that the suite cannot be run automatically without a
+ * Make sure that the environment includes the following properties in SPARK_DAEMON_JAVA_OPTS:
+ * - spark.deploy.recoveryMode=ZOOKEEPER
+ * - spark.deploy.zookeeper.url=172.17.42.1:2181
+ * Note that 172.17.42.1 is the default docker ip for the host and 2181 is the default ZK port.
+ *
+ * Unfortunately, due to the Docker dependency this suite cannot be run automatically without a
* working installation of Docker. In addition to having Docker, the following are assumed:
* - Docker can run without sudo (see http://docs.docker.io/en/latest/use/basics/)
* - The docker images tagged spark-test-master and spark-test-worker are built from the
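The updated comment describes how to configure the environment before launching the suite. A minimal sketch of that setup, assuming the default docker0 host IP and ZooKeeper port cited in the comment (adjust both for your own Docker network):

```shell
# Set the recovery properties the FaultToleranceTest comment requires.
# 172.17.42.1 is the default Docker host IP and 2181 the default ZK port,
# per the comment above; change them if your setup differs.
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=172.17.42.1:2181"

# Launch the suite as the comment describes (requires a working Docker
# installation and the spark-test-master / spark-test-worker images):
./spark-class org.apache.spark.deploy.FaultToleranceTest
```

This is a configuration sketch, not part of the patch itself; the suite will still fail without the Docker prerequisites listed in the comment.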