author    Syed Hashmi <shashmi@cloudera.com>      2014-06-03 12:04:47 -0700
committer Matei Zaharia <matei@databricks.com>    2014-06-03 12:04:47 -0700
commit    7782a304ad105ec95cf62cb799e365e5fb385a69 (patch)
tree      223ba7bf21bbcb5bb7477d811ef3aa89b3dcd59f /sql
parent    862283e9ccace6824880aa4e161723fb3248d438 (diff)
[SPARK-1942] Stop clearing spark.driver.port in unit tests
Stop resetting spark.driver.port in unit tests (Scala, Java and Python).

Author: Syed Hashmi <shashmi@cloudera.com>
Author: CodingCat <zhunansjtu@gmail.com>

Closes #943 from syedhashmi/master and squashes the following commits:

885f210 [Syed Hashmi] Removing unnecessary file (created by mergetool)
b8bd4b5 [Syed Hashmi] Merge remote-tracking branch 'upstream/master'
b895e59 [Syed Hashmi] Revert "[SPARK-1784] Add a new partitioner"
57b6587 [Syed Hashmi] Revert "[SPARK-1784] Add a balanced partitioner"
1574769 [Syed Hashmi] [SPARK-1942] Stop clearing spark.driver.port in unit tests
4354836 [Syed Hashmi] Revert "SPARK-1686: keep schedule() calling in the main thread"
fd36542 [Syed Hashmi] [SPARK-1784] Add a balanced partitioner
6668015 [CodingCat] SPARK-1686: keep schedule() calling in the main thread
4ca94cc [Syed Hashmi] [SPARK-1784] Add a new partitioner
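For context, a minimal sketch of the pattern this commit removes, assuming a generic local-mode test harness (LocalTestHarness and its methods are hypothetical and not part of this patch): suites used to clear spark.driver.port between runs, whereas after SPARK-1942 only spark.hostPort is cleared and Spark picks a free driver port on its own.

    // Hypothetical sketch, not part of this patch.
    import org.apache.spark.{SparkConf, SparkContext}

    object LocalTestHarness {
      // Before SPARK-1942, teardown also cleared the driver port:
      //   System.clearProperty("spark.driver.port")
      // After this change, only the cached host:port property is cleared and
      // Spark chooses a free driver port when the next context starts.
      def stopAndReset(sc: SparkContext): Unit = {
        sc.stop()
        System.clearProperty("spark.hostPort")
      }

      // A fresh local context for the next suite in the same JVM.
      def newLocalContext(): SparkContext =
        new SparkContext(new SparkConf().setMaster("local").setAppName("test"))
    }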
Diffstat (limited to 'sql')
-rw-r--r--  sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala | 1 -
1 file changed, 0 insertions(+), 1 deletion(-)
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
index fa7d010459..041e813598 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
@@ -58,7 +58,6 @@ class TestHiveContext(sc: SparkContext) extends LocalHiveContext(sc) {
   // By clearing the port we force Spark to pick a new one. This allows us to rerun tests
   // without restarting the JVM.
-  System.clearProperty("spark.driver.port")
   System.clearProperty("spark.hostPort")
   override lazy val warehousePath = getTempFilePath("sparkHiveWarehouse").getCanonicalPath
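As a follow-up to the comment in the hunk above, a hedged sketch (not from this repository) of how one might confirm in Spark 1.x that back-to-back local contexts in one JVM bind fresh driver ports without any property clearing; DriverPortCheck is a hypothetical name, and it assumes sc.getConf exposes the "spark.driver.port" value Spark records once the driver is up.

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical check: start two local contexts in the same JVM and print
    // the driver port each one actually bound.
    object DriverPortCheck {
      private def boundPort(): String = {
        val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("port-check"))
        val port = sc.getConf.get("spark.driver.port") // recorded by Spark at driver startup
        sc.stop()
        port
      }

      def main(args: Array[String]): Unit = {
        // No System.clearProperty("spark.driver.port") between runs.
        println(s"first run bound port ${boundPort()}, second run bound port ${boundPort()}")
      }
    }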