path: root/python/pyspark/shell.py
author    Aaron Davidson <aaron@databricks.com>  2013-09-07 09:28:39 -0700
committer Aaron Davidson <aaron@databricks.com>  2013-09-07 09:34:07 -0700
commit    8001687af597056f630fb81f1edbcaf354c5388a (patch)
tree      175599cec3e8cbcc3c4cd97ad962bc1408dd9027 /python/pyspark/shell.py
parent    b8a0b6ea5ee409dc51e121915794bccce92d457c (diff)
Remove reflection, hard-code StorageLevels
The sc.StorageLevel -> StorageLevel pathway is a bit janky, but otherwise the shell would have to call a private method of SparkContext. Having StorageLevel available in sc also doesn't seem like the end of the world. There may be a better solution, though. As for creating the StorageLevel object itself, this seems to be the best way in Python 2 for creating singleton, enum-like objects: http://stackoverflow.com/questions/36932/how-can-i-represent-an-enum-in-python
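As a minimal sketch of the singleton, enum-like pattern the commit message links to: a class whose predefined instances are attached to the class itself, so each level is a single shared object. The field names below are illustrative assumptions, not PySpark's actual `StorageLevel` definition.

```python
class StorageLevel(object):
    """Enum-like holder: each storage level is one hard-coded instance.

    Field names (use_disk, use_memory, deserialized, replication) are
    hypothetical, chosen only to illustrate the pattern.
    """

    def __init__(self, use_disk, use_memory, deserialized, replication=1):
        self.use_disk = use_disk
        self.use_memory = use_memory
        self.deserialized = deserialized
        self.replication = replication

# Hard-code the singleton instances as class attributes, one per level.
StorageLevel.DISK_ONLY = StorageLevel(True, False, False)
StorageLevel.MEMORY_ONLY = StorageLevel(False, True, True)
StorageLevel.MEMORY_AND_DISK = StorageLevel(True, True, True)
```

Because the levels are plain class attributes, `StorageLevel.MEMORY_ONLY is StorageLevel.MEMORY_ONLY` holds everywhere, which is the singleton property the commit relies on when hard-coding the levels instead of discovering them via reflection.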
Diffstat (limited to 'python/pyspark/shell.py')
-rw-r--r--  python/pyspark/shell.py | 4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index 9acc176d55..e374ca4ee4 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -23,13 +23,13 @@ This file is designed to be launched as a PYTHONSTARTUP script.
 import os
 import platform
 import pyspark
-from pyspark.context import SparkContext, StorageLevelReader
+from pyspark.context import SparkContext
 # this is the equivalent of ADD_JARS
 add_files = os.environ.get("ADD_FILES").split(',') if os.environ.get("ADD_FILES") != None else None
 sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
-StorageLevel = StorageLevelReader(sc)
+StorageLevel = sc.StorageLevel # alias StorageLevel to global scope
 print """Welcome to
       ____              __