author     Andrew Or <andrewor14@gmail.com>    2014-05-16 22:34:38 -0700
committer  Patrick Wendell <pwendell@gmail.com>    2014-05-16 22:34:38 -0700
commit     4b8ec6fcfd7a7ef0857d5b21917183c181301c95 (patch)
tree       d20ce09d28fac8caf0cec1ef68fbf72a4b3b62a1 /python
parent     c0ab85d7320cea90e6331fb03a70349bc804c1b1 (diff)
[SPARK-1808] Route bin/pyspark through Spark submit
**Problem.** For `bin/pyspark`, there is currently no way to specify Spark configuration properties other than through `SPARK_JAVA_OPTS` in `conf/spark-env.sh`. However, this mechanism is supposedly deprecated. Instead, it needs to pick up configurations explicitly specified in `conf/spark-defaults.conf`.

**Solution.** Have `bin/pyspark` invoke `bin/spark-submit`, like all of its counterparts in Scala land (i.e. `bin/spark-shell`, `bin/run-example`). This has the additional benefit of making the invocation of all the user-facing Spark scripts consistent.

**Details.** `bin/pyspark` inherently handles two cases: (1) running python applications and (2) running the python shell. For (1), Spark submit already handles running python applications, so when `bin/pyspark` is given a python file we can simply pass the file directly to Spark submit and let it handle the rest. For (2), `bin/pyspark` starts a python process as before, which launches the JVM as a sub-process. The existing code already provides a code path to do this; all we need to change is to use `bin/spark-submit` instead of `spark-class` to launch the JVM. This requires modifications to Spark submit to handle the pyspark shell as a special case.

This has been tested locally (OSX and Windows 7), on a standalone cluster, and on a YARN cluster. Running IPython also works as before, except now it takes in Spark submit arguments too.

Author: Andrew Or <andrewor14@gmail.com>

Closes #799 from andrewor14/pyspark-submit and squashes the following commits:

bf37e36 [Andrew Or] Minor changes
01066fa [Andrew Or] bin/pyspark for Windows
c8cb3bf [Andrew Or] Handle perverse app names (with escaped quotes)
1866f85 [Andrew Or] Windows is not cooperating
456d844 [Andrew Or] Guard against shlex hanging if PYSPARK_SUBMIT_ARGS is not set
7eebda8 [Andrew Or] Merge branch 'master' of github.com:apache/spark into pyspark-submit
b7ba0d8 [Andrew Or] Address a few comments (minor)
06eb138 [Andrew Or] Use shlex instead of writing our own parser
05879fa [Andrew Or] Merge branch 'master' of github.com:apache/spark into pyspark-submit
a823661 [Andrew Or] Fix --die-on-broken-pipe not propagated properly
6fba412 [Andrew Or] Deal with quotes + address various comments
fe4c8a7 [Andrew Or] Update --help for bin/pyspark
afe47bf [Andrew Or] Fix spark shell
f04aaa4 [Andrew Or] Merge branch 'master' of github.com:apache/spark into pyspark-submit
a371d26 [Andrew Or] Route bin/pyspark through Spark submit
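One of the squashed commits above guards against `shlex` hanging when `PYSPARK_SUBMIT_ARGS` is unset. A minimal sketch of why that guard matters, using hypothetical argument values (the real parsing lives in the `java_gateway.py` diff below):

    import os
    import shlex

    # shlex.split(None) falls back to reading from sys.stdin, so an unset
    # PYSPARK_SUBMIT_ARGS must be normalized to "" before splitting;
    # otherwise the gateway launch hangs waiting for input.
    submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS", "")

    # shlex honors shell-style quoting, so a quoted app name survives as a
    # single argument (hypothetical values, for illustration only):
    print(shlex.split('--master local[4] --name "my app"'))
    # ['--master', 'local[4]', '--name', 'my app']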
Diffstat (limited to 'python')
-rw-r--r--  python/pyspark/java_gateway.py  | 10
-rw-r--r--  python/pyspark/shell.py         |  2
2 files changed, 7 insertions, 5 deletions
diff --git a/python/pyspark/java_gateway.py b/python/pyspark/java_gateway.py
index 3d0936fdca..91ae8263f6 100644
--- a/python/pyspark/java_gateway.py
+++ b/python/pyspark/java_gateway.py
@@ -18,12 +18,12 @@
 import os
 import sys
 import signal
+import shlex
 import platform
 from subprocess import Popen, PIPE
 from threading import Thread
 from py4j.java_gateway import java_import, JavaGateway, GatewayClient
 
-
 def launch_gateway():
     SPARK_HOME = os.environ["SPARK_HOME"]
@@ -34,9 +34,11 @@ def launch_gateway():
     # Launch the Py4j gateway using Spark's run command so that we pick up the
     # proper classpath and settings from spark-env.sh
     on_windows = platform.system() == "Windows"
-    script = "./bin/spark-class.cmd" if on_windows else "./bin/spark-class"
-    command = [os.path.join(SPARK_HOME, script), "py4j.GatewayServer",
-               "--die-on-broken-pipe", "0"]
+    script = "./bin/spark-submit.cmd" if on_windows else "./bin/spark-submit"
+    submit_args = os.environ.get("PYSPARK_SUBMIT_ARGS")
+    submit_args = submit_args if submit_args is not None else ""
+    submit_args = shlex.split(submit_args)
+    command = [os.path.join(SPARK_HOME, script), "pyspark-shell"] + submit_args
     if not on_windows:
         # Don't send ctrl-c / SIGINT to the Java gateway:
         def preexec_func():
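The `preexec_func` defined at the end of the hunk above is what keeps a Ctrl-C in the shell from tearing down the JVM. A self-contained sketch of the pattern, with a placeholder `command` standing in for the real spark-submit invocation built from `SPARK_HOME`:

    import signal
    from subprocess import PIPE, Popen

    # Placeholder command; the real one joins SPARK_HOME with the script
    # and appends the parsed PYSPARK_SUBMIT_ARGS.
    command = ["./bin/spark-submit", "pyspark-shell"]

    def preexec_func():
        # Runs in the child between fork() and exec() (POSIX only), which
        # is why the diff guards it with `if not on_windows`. Ignoring
        # SIGINT here keeps a Ctrl-C typed into the Python shell from
        # also killing the Java gateway.
        signal.signal(signal.SIGINT, signal.SIG_IGN)

    proc = Popen(command, stdout=PIPE, stdin=PIPE, preexec_fn=preexec_func)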
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index d172d588bf..ebd714db7a 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -40,7 +40,7 @@ add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES"
 if os.environ.get("SPARK_EXECUTOR_URI"):
     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
 
-sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell", pyFiles=add_files)
+sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
 print("""Welcome to
       ____              __
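The one-line `shell.py` change above is what actually defers master selection to Spark submit. A before/after sketch of the difference:

    import os
    from pyspark import SparkContext

    # Before: the master was resolved in Python from the MASTER env var,
    # bypassing spark-submit and conf/spark-defaults.conf entirely.
    # sc = SparkContext(os.environ.get("MASTER", "local[*]"), "PySparkShell")

    # After: only the app name is pinned here; the master comes from
    # whatever Spark submit resolved (--master, or spark-defaults.conf).
    sc = SparkContext(appName="PySparkShell")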