path: root/python/run-tests
author    Matei Zaharia <matei@databricks.com>  2013-12-29 14:31:45 -0500
committer Matei Zaharia <matei@databricks.com>  2013-12-29 14:32:05 -0500
commit    615fb649d66b13371927a051d249433d746c5f19 (patch)
tree      5a3b3487b46517765d31cdc0f2c2f340c714666d /python/run-tests
parent    cd00225db9b90fc845fd1458831bdd9d014d1bb6 (diff)
Fix some other Python tests due to initializing JVM in a different way
The test code in context.py created two different instances of the SparkContext class by copying "globals", so that some tests could use a global "sc" object while others tried initializing their own contexts. This led to two JVM gateways being created, since SparkConf also looked at pyspark.context.SparkContext to get the JVM.
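For readers unfamiliar with the failure mode, here is a minimal, self-contained sketch of the pattern described above. It is not the actual PySpark code; Gateway and Context are hypothetical stand-ins for the py4j JVM gateway and SparkContext, used only to show how running doctests against a copied globals dict leaves two live contexts, each with its own gateway.

# Minimal sketch of the problem; Gateway and Context are hypothetical
# stand-ins, not the real py4j gateway or SparkContext.
import doctest

class Gateway:
    """Stand-in for a per-context JVM gateway."""
    count = 0
    def __init__(self):
        Gateway.count += 1

class Context:
    """Stand-in for SparkContext: constructing one launches a gateway."""
    def __init__(self):
        self.gateway = Gateway()

if __name__ == "__main__":
    sc = Context()              # module-level "sc" shared by most doctests
    globs = globals().copy()    # doctests run against a *copy* of globals
    globs["sc"] = Context()     # a second context created just for the tests
    doctest.testmod(globs=globs)
    print(Gateway.count)        # prints 2: two contexts, two gateways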
Diffstat (limited to 'python/run-tests')
-rwxr-xr-x  python/run-tests  1
1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/python/run-tests b/python/run-tests
index d4dad672d2..a0898b3c21 100755
--- a/python/run-tests
+++ b/python/run-tests
@@ -35,6 +35,7 @@ function run_test() {
run_test "pyspark/rdd.py"
run_test "pyspark/context.py"
+run_test "pyspark/conf.py"
run_test "-m doctest pyspark/broadcast.py"
run_test "-m doctest pyspark/accumulators.py"
run_test "-m doctest pyspark/serializers.py"