author    Peter Ableda <peter.ableda@cloudera.com>  2016-12-06 10:12:27 -0800
committer Marcelo Vanzin <vanzin@cloudera.com>  2016-12-06 10:12:27 -0800
commit    05d416ffc616930d05b59bce0ca6cbd682f8b5bc
tree      087f5faa1e3b8f25c5e6a0ac4f67d186533a173b
parent    381ef4ea76b0920e05c81adb44b1fef88bee5d25
[SPARK-18740] Log spark.app.name in driver logs
## What changes were proposed in this pull request?
Added a simple `logInfo` line to print the `spark.app.name` in the driver logs.
## How was this patch tested?
Spark was built and tested with the SparkPi app. Example log:
```
16/12/06 05:49:50 INFO spark.SparkContext: Running Spark version 2.0.0
16/12/06 05:49:52 INFO spark.SparkContext: Submitted application: Spark Pi
16/12/06 05:49:52 INFO spark.SecurityManager: Changing view acls to: root
16/12/06 05:49:52 INFO spark.SecurityManager: Changing modify acls to: root
```
Author: Peter Ableda <peter.ableda@cloudera.com>
Closes #16172 from peterableda/feature/print_appname.
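The change itself is small: once a valid `spark.app.name` has been resolved, the driver logs it via `logInfo` during `SparkContext` construction. A minimal, self-contained sketch of that pattern (this is not Spark's actual `Logging` trait or config machinery, just an illustration of reading the app name from configuration and logging it at startup):

```scala
// Illustrative sketch only: mimics how SparkContext validates that an
// application name is configured and then logs it to the driver log.
object DriverLogSketch {
  // Stand-in for the logInfo method provided by Spark's Logging trait.
  def logInfo(msg: String): Unit = println(s"INFO SparkContext: $msg")

  def main(args: Array[String]): Unit = {
    // Stand-in for SparkConf: a plain map of configuration keys.
    val conf = Map("spark.app.name" -> "Spark Pi")
    val appName = conf.getOrElse("spark.app.name",
      throw new IllegalArgumentException(
        "An application name must be set in your configuration"))
    // The line this patch adds, so the driver log records which app started.
    logInfo(s"Submitted application: $appName")
  }
}
```

In a real application the name comes from `SparkConf.setAppName` (or `--name` on `spark-submit`), and with this patch the driver log then contains a `Submitted application: ...` line as shown above.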
 core/src/main/scala/org/apache/spark/SparkContext.scala | 3 +++
 1 file changed, 3 insertions(+)
```
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index b8414b5d09..be4dae19df 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -382,6 +382,9 @@ class SparkContext(config: SparkConf) extends Logging {
       throw new SparkException("An application name must be set in your configuration")
     }

+    // log out spark.app.name in the Spark driver logs
+    logInfo(s"Submitted application: $appName")
+
     // System property spark.yarn.app.id must be set if user code ran by AM on a YARN cluster
     if (master == "yarn" && deployMode == "cluster" && !_conf.contains("spark.yarn.app.id")) {
       throw new SparkException("Detected yarn cluster mode, but isn't running on a cluster. " +
```