author     azagrebin <azagrebin@gmail.com>    2015-02-16 18:06:19 -0800
committer  Andrew Or <andrew@databricks.com>  2015-02-16 18:06:58 -0800
commit     16687651f05bde8ff2e2fcef100383168958bf7f (patch)
tree       38171dd79bf81b1cf7a58298e1ef9413e61ff7d7 /python
parent     b1bd1dd3228ef50fa7310d466afd834b8cb1f22e (diff)
[SPARK-3340] Deprecate ADD_JARS and ADD_FILES
I created a patch that disables the environment variables. The Scala and Python shells now log a warning message to notify the user about the deprecation, with the following messages:
scala: "ADD_JARS environment variable is deprecated, use --jar spark submit argument instead"
python: "Warning: ADD_FILES environment variable is deprecated, use --py-files argument instead"
Is this what is expected, or should the code associated with the variables be removed completely?
Should it be documented somewhere?
Author: azagrebin <azagrebin@gmail.com>
Closes #4616 from azagrebin/master and squashes the following commits:
bab1aa9 [azagrebin] [SPARK-3340] Deprecate ADD_JARS and ADD_FILES: minor readability issue
0643895 [azagrebin] [SPARK-3340] Deprecate ADD_JARS and ADD_FILES: add warning messages
42f0107 [azagrebin] [SPARK-3340] Deprecate ADD_JARS and ADD_FILES
Diffstat (limited to 'python')
-rw-r--r-- | python/pyspark/shell.py | 8 |
1 file changed, 5 insertions, 3 deletions
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index 89cf76920e..4cf4b89ccf 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -35,9 +35,10 @@ import pyspark
 from pyspark.context import SparkContext
 from pyspark.storagelevel import StorageLevel

-# this is the equivalent of ADD_JARS
-add_files = (os.environ.get("ADD_FILES").split(',')
-             if os.environ.get("ADD_FILES") is not None else None)
+# this is the deprecated equivalent of ADD_JARS
+add_files = None
+if os.environ.get("ADD_FILES") is not None:
+    add_files = os.environ.get("ADD_FILES").split(',')

 if os.environ.get("SPARK_EXECUTOR_URI"):
     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
@@ -59,6 +60,7 @@ print("Using Python version %s (%s, %s)" % (
 print("SparkContext available as sc.")

 if add_files is not None:
+    print("Warning: ADD_FILES environment variable is deprecated, use --py-files argument instead")
     print("Adding files: [%s]" % ", ".join(add_files))

 # The ./bin/pyspark script stores the old PYTHONSTARTUP value in OLD_PYTHONSTARTUP,
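The pattern applied in the diff above — read a deprecated comma-separated environment variable, warn if it is set, and return the parsed list — can be sketched as a standalone snippet. This is a minimal illustration only; the helper name `get_deprecated_list_env` is hypothetical and not part of Spark:

```python
import os
import warnings


def get_deprecated_list_env(name, replacement):
    """Read a comma-separated list from a deprecated environment
    variable, warning the user if it is set.

    Returns None when the variable is unset, mirroring the
    add_files handling in pyspark/shell.py.
    """
    value = os.environ.get(name)
    if value is None:
        return None
    warnings.warn(
        "%s environment variable is deprecated, use %s instead"
        % (name, replacement),
        DeprecationWarning,
    )
    return value.split(',')


# Mirrors the shell.py change: ADD_FILES is superseded by --py-files.
add_files = get_deprecated_list_env("ADD_FILES", "--py-files argument")
```

Keeping the warning at the point where the variable is read (rather than removing the code outright) preserves backward compatibility for one release while nudging users toward the spark-submit flags.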