author    Sami Jaktholm <sjakthol@outlook.com>  2016-09-14 09:38:30 +0100
committer Sean Owen <sowen@cloudera.com>        2016-09-14 09:38:30 +0100
commit    b5bfcddbfbc2e79d3d0fbd43942716946e6c4ba3 (patch)
tree      82c4e6e0608ee6ab76da113d85642df421396d38 /python
parent    def7c265f539f3e119f068b6e9050300d05b14a4 (diff)
[SPARK-17525][PYTHON] Remove SparkContext.clearFiles() from the PySpark API as it was removed from the Scala API prior to Spark 2.0.0

## What changes were proposed in this pull request?

This pull request removes the SparkContext.clearFiles() method from the PySpark API, as the method was removed from the Scala API in 8ce645d4eeda203cf5e100c4bdba2d71edd44e6a. Using that method in PySpark leads to an exception, as PySpark tries to call the non-existent method on the JVM side.

## How was this patch tested?

Existing tests (though none of them tested this particular method).

Author: Sami Jaktholm <sjakthol@outlook.com>

Closes #15081 from sjakthol/pyspark-sc-clearfiles.
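The failure mode described above can be sketched without a JVM: py4j-style proxies resolve JVM method names dynamically, so a method that was deleted on the Scala side fails only when it is actually invoked, not at import or lookup time. The stub below is a minimal illustration of that behavior; `JvmProxy` and `MissingMethodError` are hypothetical names for this sketch, not py4j's real classes.

```python
# Sketch of why the stale clearFiles() binding failed at runtime: the
# Python wrapper still forwarded the call, but the JVM-side method was gone.

class MissingMethodError(Exception):
    """Illustrative stand-in for the py4j error raised for missing methods."""

class JvmProxy:
    """Toy proxy exposing only the methods that still exist on the JVM side."""
    _available = {"addFile", "addPyFile"}

    def __getattr__(self, name):
        # Mimics dynamic dispatch: any name resolves to a callable, and the
        # error surfaces only when the call actually crosses to the "JVM".
        def _invoke(*args, **kwargs):
            if name not in self._available:
                raise MissingMethodError(
                    "Method %s does not exist on the JVM object" % name)
            return None
        return _invoke

proxy = JvmProxy()
proxy.addFile("/tmp/data.txt")   # fine: method still exists on the JVM side
try:
    proxy.clearFiles()           # removed on the Scala side: raises at call time
except MissingMethodError as e:
    print("caught:", e)
```

Removing the dead Python wrapper, as this patch does, turns the confusing call-time JVM error into an ordinary AttributeError on SparkContext itself.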
Diffstat (limited to 'python')
-rw-r--r--  python/pyspark/context.py | 8
1 file changed, 0 insertions(+), 8 deletions(-)
diff --git a/python/pyspark/context.py b/python/pyspark/context.py
index 6e9f24ef10..2744bb9ec0 100644
--- a/python/pyspark/context.py
+++ b/python/pyspark/context.py
@@ -787,14 +787,6 @@ class SparkContext(object):
         """
         self._jsc.sc().addFile(path)

-    def clearFiles(self):
-        """
-        Clear the job's list of files added by L{addFile} or L{addPyFile} so
-        that they do not get downloaded to any new nodes.
-        """
-        # TODO: remove added .py or .zip files from the PYTHONPATH?
-        self._jsc.sc().clearFiles()
-
     def addPyFile(self, path):
         """
         Add a .py or .zip dependency for all tasks to be executed on this