author	jerryshao <sshao@hortonworks.com>	2016-09-21 17:57:21 -0400
committer	Andrew Or <andrewor14@gmail.com>	2016-09-21 17:57:21 -0400
commit	8c3ee2bc42e6320b9341cebdba51a00162c897ea (patch)
tree	16ed50761cb45cd5eb4ca7b31e6b53ee88d98621 /core/src/test/scala
parent	9fcf1c51d518847eda7f5ea71337cfa7def3c45c (diff)
[SPARK-17512][CORE] Avoid formatting to python path for yarn and mesos cluster mode
## What changes were proposed in this pull request?
Yarn and Mesos cluster modes support remote python paths (HDFS/S3 schemes) through their own mechanisms, so it is not necessary to check and format the python path when running in these modes. This is a potential regression compared to 1.6, so this patch fixes it.
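The guard described above can be sketched as follows. This is a hedged illustration, not the actual `SparkSubmit` source: the object name `PyFilesResolution`, the boolean parameters `isYarnCluster`/`isMesosCluster`, and the simplified stand-ins for `Utils.resolveURIs` and `PythonRunner.formatPaths` are all assumptions made for this sketch.

```scala
// Sketch only: illustrates skipping local-path formatting for yarn/mesos
// cluster mode. All names here are hypothetical stand-ins for the real
// SparkSubmit helpers.
object PyFilesResolution {
  // Simplified stand-in for Utils.resolveURIs: give scheme-less paths a file: scheme.
  def resolveURIs(paths: String): String =
    paths.split(",").map { p =>
      if (p.contains(":")) p else "file:" + p
    }.mkString(",")

  // Simplified stand-in for PythonRunner.formatPaths: strip the file: scheme
  // so the paths can be placed on a local PYTHONPATH.
  def formatPaths(paths: String): String =
    paths.split(",").map(_.stripPrefix("file:")).mkString(",")

  // The idea of the fix: yarn/mesos cluster mode fetches remote python files
  // itself, so leave the resolved URIs untouched; only format for modes that
  // need local paths.
  def resolvePyFiles(pyFiles: String,
                     isYarnCluster: Boolean,
                     isMesosCluster: Boolean): String = {
    val resolved = resolveURIs(pyFiles)
    if (isYarnCluster || isMesosCluster) resolved
    else formatPaths(resolved)
  }
}
```

Under this sketch, an `hdfs://` path survives unchanged in yarn cluster mode, while a plain local path is still resolved and formatted for client mode.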
## How was this patch tested?
A unit test verifies the SparkSubmit arguments, together with local cluster verification. Because Spark's unit tests lack `MiniDFSCluster` support, no integration test was added.
Author: jerryshao <sshao@hortonworks.com>
Closes #15137 from jerryshao/SPARK-17512.
Diffstat (limited to 'core/src/test/scala')
-rw-r--r--	core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala	19
1 file changed, 19 insertions(+), 0 deletions(-)
diff --git a/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala b/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
index 961ece3e00..31c8fb2646 100644
--- a/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
+++ b/core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala
@@ -582,6 +582,25 @@ class SparkSubmitSuite
     val sysProps3 = SparkSubmit.prepareSubmitEnvironment(appArgs3)._3
     sysProps3("spark.submit.pyFiles") should be(
       PythonRunner.formatPaths(Utils.resolveURIs(pyFiles)).mkString(","))
+
+    // Test remote python files
+    val f4 = File.createTempFile("test-submit-remote-python-files", "", tmpDir)
+    val writer4 = new PrintWriter(f4)
+    val remotePyFiles = "hdfs:///tmp/file1.py,hdfs:///tmp/file2.py"
+    writer4.println("spark.submit.pyFiles " + remotePyFiles)
+    writer4.close()
+    val clArgs4 = Seq(
+      "--master", "yarn",
+      "--deploy-mode", "cluster",
+      "--properties-file", f4.getPath,
+      "hdfs:///tmp/mister.py"
+    )
+    val appArgs4 = new SparkSubmitArguments(clArgs4)
+    val sysProps4 = SparkSubmit.prepareSubmitEnvironment(appArgs4)._3
+    // Should not format python path for yarn cluster mode
+    sysProps4("spark.submit.pyFiles") should be(
+      Utils.resolveURIs(remotePyFiles)
+    )
   }

   test("user classpath first in driver") {