author    Michael Armbrust <michael@databricks.com>  2015-01-12 11:57:59 -0800
committer Patrick Wendell <pwendell@gmail.com>       2015-01-12 11:58:21 -0800
commit    5970f0bbc7ff31df3f8d2c6dc0b46cd9f63ebe9a (patch)
tree      814aff2f84a2f126fbdd18fe25da39099b6fdc44
parent    558be07710cef7a8c2ba1e4237cb6fafdf34981b (diff)
[SPARK-5078] Optionally read from SPARK_LOCAL_HOSTNAME
Currently, Spark lets you set the IP address using SPARK_LOCAL_IP, but this value is handed to Akka only after a reverse DNS lookup. That makes it difficult to run Spark in Docker. The hostname can already be changed programmatically, but it would be nice to be able to set it with an environment variable as well.
Author: Michael Armbrust <michael@databricks.com>
Closes #3893 from marmbrus/localHostnameEnv and squashes the following commits:
85045b6 [Michael Armbrust] Optionally read from SPARK_LOCAL_HOSTNAME
(cherry picked from commit a3978f3e156e0ca67e978f1795b238ddd69ff9a6)
Signed-off-by: Patrick Wendell <pwendell@gmail.com>
-rw-r--r--  core/src/main/scala/org/apache/spark/util/Utils.scala | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/util/Utils.scala b/core/src/main/scala/org/apache/spark/util/Utils.scala
index 4b62423e53..e7160f164a 100644
--- a/core/src/main/scala/org/apache/spark/util/Utils.scala
+++ b/core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -701,7 +701,7 @@ private[spark] object Utils extends Logging {
     }
   }

-  private var customHostname: Option[String] = None
+  private var customHostname: Option[String] = sys.env.get("SPARK_LOCAL_HOSTNAME")

   /**
    * Allow setting a custom host name because when we run on Mesos we need to use the same
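The one-line change above replaces a hard-coded `None` with an environment lookup, so the override is picked up at class-initialization time while the programmatic setter keeps working. A minimal self-contained sketch of that pattern (the object and method names here are illustrative, not Spark's full `Utils` API; `SPARK_LOCAL_HOSTNAME` is the real variable):

```scala
// Sketch of the env-var override pattern introduced by this commit.
object HostnameResolver {
  // Initialized from the environment once, instead of starting as None.
  private var customHostname: Option[String] = sys.env.get("SPARK_LOCAL_HOSTNAME")

  // The existing programmatic override still takes precedence when called.
  def setCustomHostname(hostname: String): Unit = {
    customHostname = Some(hostname)
  }

  // Returns the override if present, otherwise falls back to a resolved
  // address (in Spark this would come from a DNS lookup of the local host).
  def localHostName(fallback: => String): String =
    customHostname.getOrElse(fallback)
}
```

Because the lookup happens at initialization rather than per-call, setting `SPARK_LOCAL_HOSTNAME=myhost` in a Docker container's environment before the JVM starts is enough to skip the reverse-DNS path entirely.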