| author | Joseph Gonzalez <joseph.e.gonzalez@gmail.com> | 2015-07-14 00:32:29 -0700 |
|---|---|---|
| committer | Reynold Xin <rxin@databricks.com> | 2015-07-14 00:32:29 -0700 |
| commit | 20c1434a8dbb25b98f6b434b158ae88e44ce3057 (patch) | |
| tree | 3fe7b7ceb5f7d80995c2f968c8fd6ee444df33c2 /launcher/src | |
| parent | 408b384de96b9dbe94945753f7947fbe84272ae1 (diff) | |
[SPARK-9001] Fixing errors in javadocs that lead to failed build/sbt doc
These are minor corrections to the documentation of several classes that prevent the following from completing:
```bash
build/sbt publish-local
```
I believe this might be an issue associated with running JDK 8, as ankurdave does not appear to hit it on JDK 7.
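For context (an editorial addition, not part of the original commit message): JDK 8's `javadoc` introduced doclint, which rejects malformed HTML in doc comments, including the self-closing `<p/>` form this patch replaces, which explains why the build fails on JDK 8 but not JDK 7. A hypothetical minimal reproduction, assuming a JDK 8+ `javadoc` on the PATH (`Bad.java` is an illustrative file name, not from the patch):

```shell
# Write a class whose Javadoc uses the self-closing <p/> form.
cat > Bad.java <<'EOF'
/**
 * Demo class.
 * <p/>
 * The self-closing paragraph tag above is what JDK 8's doclint rejects.
 */
public class Bad {}
EOF

# On JDK 8+ this fails doclint with an invalid-HTML error.
javadoc -d out Bad.java || echo "javadoc rejected the self-closing tag"

# Workaround: disable doclint entirely. The proper fix, as in this commit,
# is to write valid HTML instead.
javadoc -Xdoclint:none -d out Bad.java || true
```

Note that the commit takes the proper route, replacing `<p/>` with paired `<p>...</p>` tags rather than disabling doclint.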
Author: Joseph Gonzalez <joseph.e.gonzalez@gmail.com>
Closes #7354 from jegonzal/FixingJavadocErrors and squashes the following commits:
6664b7e [Joseph Gonzalez] making requested changes
2e16d89 [Joseph Gonzalez] Fixing errors in javadocs that prevent build/sbt publish-local from completing.
Diffstat (limited to 'launcher/src')
-rw-r--r-- | launcher/src/main/java/org/apache/spark/launcher/SparkLauncher.java | 5 |
-rw-r--r-- | launcher/src/main/java/org/apache/spark/launcher/package-info.java | 10 |
2 files changed, 10 insertions, 5 deletions
```diff
diff --git a/launcher/src/main/java/org/apache/spark/launcher/SparkLauncher.java b/launcher/src/main/java/org/apache/spark/launcher/SparkLauncher.java
index d4cfeacb6e..c0f89c9230 100644
--- a/launcher/src/main/java/org/apache/spark/launcher/SparkLauncher.java
+++ b/launcher/src/main/java/org/apache/spark/launcher/SparkLauncher.java
@@ -25,11 +25,12 @@ import java.util.Map;
 import static org.apache.spark.launcher.CommandBuilderUtils.*;
 
-/** 
+/**
  * Launcher for Spark applications.
- * <p/>
+ * <p>
  * Use this class to start Spark applications programmatically. The class uses a builder pattern
  * to allow clients to configure the Spark application and launch it as a child process.
+ * </p>
  */
 public class SparkLauncher {
diff --git a/launcher/src/main/java/org/apache/spark/launcher/package-info.java b/launcher/src/main/java/org/apache/spark/launcher/package-info.java
index 7ed756f4b8..7c97dba511 100644
--- a/launcher/src/main/java/org/apache/spark/launcher/package-info.java
+++ b/launcher/src/main/java/org/apache/spark/launcher/package-info.java
@@ -17,13 +17,17 @@
 /**
  * Library for launching Spark applications.
- * <p/>
+ *
+ * <p>
  * This library allows applications to launch Spark programmatically. There's only one entry
  * point to the library - the {@link org.apache.spark.launcher.SparkLauncher} class.
- * <p/>
+ * </p>
+ *
+ * <p>
  * To launch a Spark application, just instantiate a {@link org.apache.spark.launcher.SparkLauncher}
  * and configure the application to run. For example:
- *
+ * </p>
+ *
  * <pre>
  * {@code
  * import org.apache.spark.launcher.SparkLauncher;
```