path: root/python/pyspark/context.py
author    Sean Owen <sowen@cloudera.com>  2016-11-03 17:27:23 -0700
committer Reynold Xin <rxin@databricks.com>  2016-11-03 17:27:23 -0700
commit    dc4c60098641cf64007e2f0e36378f000ad5f6b1 (patch)
tree      fad72496e3f06613484fdac6c8c13353c79eb838 /python/pyspark/context.py
parent    f22954ad49bf5a32c7b6d8487cd38ffe0da904ca (diff)
download  spark-dc4c60098641cf64007e2f0e36378f000ad5f6b1.tar.gz
          spark-dc4c60098641cf64007e2f0e36378f000ad5f6b1.tar.bz2
          spark-dc4c60098641cf64007e2f0e36378f000ad5f6b1.zip
[SPARK-18138][DOCS] Document that Java 7, Python 2.6, Scala 2.10, Hadoop < 2.6 are deprecated in Spark 2.1.0
## What changes were proposed in this pull request?

Document that Java 7, Python 2.6, Scala 2.10, and Hadoop < 2.6 are deprecated in Spark 2.1.0. This does not actually implement any of the changes in SPARK-18138; it just peppers the documentation with notices about it.

## How was this patch tested?

Doc build

Author: Sean Owen <sowen@cloudera.com>

Closes #15733 from srowen/SPARK-18138.
Diffstat (limited to 'python/pyspark/context.py')
-rw-r--r--  python/pyspark/context.py | 4
1 file changed, 4 insertions, 0 deletions
diff --git a/python/pyspark/context.py b/python/pyspark/context.py
index 1b2e199c39..2c2cf6a373 100644
--- a/python/pyspark/context.py
+++ b/python/pyspark/context.py
@@ -22,6 +22,7 @@ import shutil
 import signal
 import sys
 import threading
+import warnings
 from threading import RLock
 from tempfile import NamedTemporaryFile
@@ -187,6 +188,9 @@ class SparkContext(object):
         self.pythonExec = os.environ.get("PYSPARK_PYTHON", 'python')
         self.pythonVer = "%d.%d" % sys.version_info[:2]

+        if sys.version_info < (2, 7):
+            warnings.warn("Support for Python 2.6 is deprecated as of Spark 2.0.0")
+
         # Broadcast's __reduce__ method stores Broadcast instances here.
         # This allows other code to determine which Broadcast instances have
         # been pickled, so it can determine which Java broadcast objects to
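The pattern this commit adds — checking `sys.version_info` at startup and emitting a runtime deprecation warning — can be sketched in isolation as follows. This is a minimal illustration, not Spark code: the helper name `warn_if_old_python` and the `min_version` parameter are hypothetical, and note that the actual commit calls `warnings.warn` with the default `UserWarning` category rather than `DeprecationWarning` (which Python hides by default outside of `__main__`).

```python
import sys
import warnings

def warn_if_old_python(min_version=(2, 7)):
    """Warn when the running interpreter is older than min_version.

    Mirrors the startup check this commit adds to SparkContext:
    compare sys.version_info against a threshold tuple and emit a
    warning instead of failing hard, so existing deployments keep
    working through the deprecation period.
    """
    if sys.version_info < min_version:
        warnings.warn(
            "Support for Python %d.%d is deprecated" % sys.version_info[:2]
        )
        return True
    return False
```

Tuple comparison is what makes this idiomatic: `sys.version_info < (2, 7)` compares `(major, minor, ...)` lexicographically, so it is true for 2.6.x but false for 2.7.x and all 3.x interpreters.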