From 3df2d93146a4609c1c4a25b635a898fe5c3be9b6 Mon Sep 17 00:00:00 2001
From: Maurus Cuelenaere
Date: Sun, 15 Jan 2017 11:14:50 +0000
Subject: [MINOR][DOC] Document local[*,F] master modes

## What changes were proposed in this pull request?

core/src/main/scala/org/apache/spark/SparkContext.scala contains a LOCAL_N_FAILURES_REGEX master mode, but it was never documented, so document it.

## How was this patch tested?

By using the GitHub Markdown preview feature.

Author: Maurus Cuelenaere

Closes #16562 from mcuelenaere/patch-1.
---
 docs/submitting-applications.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/docs/submitting-applications.md b/docs/submitting-applications.md
index b738194eac..b8b4cc3a53 100644
--- a/docs/submitting-applications.md
+++ b/docs/submitting-applications.md
@@ -137,7 +137,9 @@ The master URL passed to Spark can be in one of the following formats:
 Master URL | Meaning
 local | Run Spark locally with one worker thread (i.e. no parallelism at all).
 local[K] | Run Spark locally with K worker threads (ideally, set this to the number of cores on your machine).
+local[K,F] | Run Spark locally with K worker threads and F maxFailures (see spark.task.maxFailures for an explanation of this variable).
 local[*] | Run Spark locally with as many worker threads as logical cores on your machine.
+local[*,F] | Run Spark locally with as many worker threads as logical cores on your machine and F maxFailures.
 spark://HOST:PORT | Connect to the given Spark standalone cluster master. The port must be whichever one your master is configured to use, which is 7077 by default.
--
cgit v1.2.3
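For context, here is a minimal, self-contained Scala sketch of how the `local[K,F]` and `local[*,F]` master strings documented above break down into a thread count and a maxFailures value. The patterns are modeled on LOCAL_N_REGEX and LOCAL_N_FAILURES_REGEX in SparkContext.scala, but the exact regexes and the `describe` helper here are illustrative assumptions, not the patch's code.

```scala
// Illustrative parser for local[...] master strings; the regexes mirror the
// LOCAL_N_REGEX / LOCAL_N_FAILURES_REGEX patterns referenced by the patch,
// but are assumptions for this sketch, not copied from SparkContext.scala.
object LocalMasterModes {
  // Matches local[K] or local[*]
  val LocalN = """local\[([0-9]+|\*)\]""".r
  // Matches local[K,F] or local[*,F]
  val LocalNFailures = """local\[([0-9]+|\*)\s*,\s*([0-9]+)\]""".r

  def describe(master: String): String = master match {
    case "local"                    => "1 thread, default task retries"
    case LocalN(threads)            => s"$threads thread(s), default task retries"
    case LocalNFailures(threads, f) => s"$threads thread(s), spark.task.maxFailures=$f"
    case other                      => s"not a local master: $other"
  }

  def main(args: Array[String]): Unit = {
    println(describe("local[4,2]")) // 4 thread(s), spark.task.maxFailures=2
    println(describe("local[*,3]")) // * thread(s), spark.task.maxFailures=3
  }
}
```

With this breakdown, `local[4,2]` runs 4 worker threads and allows each task up to 2 failures, which is the behavior the two new documentation rows describe.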