author    Wenchen Fan <wenchen@databricks.com>    2015-12-28 11:45:44 -0800
committer Michael Armbrust <michael@databricks.com>    2015-12-28 11:45:44 -0800
commit    8543997f2daa60dfa0509f149fab207de98145a0 (patch)
tree      ad00a8ffe1fc972a84d98f6913652e233b5e5440 /sql
parent    ab6bedd85dc29906ac2f175f603ae3b43ab03535 (diff)
[HOT-FIX] Bypass Hive tests when parsing logical plans to JSON
https://github.com/apache/spark/pull/10311 introduced some rare, non-deterministic flakiness in the Hive UDF tests; see https://github.com/apache/spark/pull/10311#issuecomment-166548851. I can't reproduce it locally and may need more time to investigate, so a quick solution is to bypass the Hive tests for JSON serialization.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #10430 from cloud-fan/hot-fix.
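For readers skimming the diff below: the fix only moves an early-return guard so that it runs before the logical plan is serialized. A minimal, self-contained sketch of that pattern follows; HiveBypassSketch, checkJsonFormat, and planToJson are hypothetical names used for illustration and are not the actual Spark QueryTest API.

// Sketch of the guard pattern used by the hot-fix: test classes living in the
// Hive package return early before the (reportedly flaky) plan-to-JSON step
// is ever attempted.
object HiveBypassSketch {

  def checkJsonFormat(testClassName: String, planToJson: () => String): Unit = {
    // bypass hive tests before all corner cases in the hive module are fixed
    if (testClassName.startsWith("org.apache.spark.sql.hive")) return

    // only non-Hive tests reach the serialization step that was reported flaky
    val jsonString =
      try planToJson()
      catch {
        case e: Exception =>
          throw new RuntimeException("Failed to parse logical plan to JSON", e)
      }

    assert(jsonString.nonEmpty, "expected a non-empty JSON string for the logical plan")
  }

  def main(args: Array[String]): Unit = {
    // a Hive-package test name skips the check entirely, even if serialization would fail
    checkJsonFormat("org.apache.spark.sql.hive.SomeHiveSuite", () => sys.error("flaky"))
    // any other test still exercises the JSON serialization path
    checkJsonFormat("org.apache.spark.sql.SQLQuerySuite", () => """{"class":"DummyPlan"}""")
    println("checks completed")
  }
}

The design choice is deliberately coarse: filtering on the test class's package name keeps the JSON round-trip check active everywhere except the Hive module, which is the smallest change that unblocks the flaky builds while the corner cases are investigated.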
Diffstat (limited to 'sql')
-rw-r--r--  sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala  6
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala b/sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala
index 9246f55020..442ae79f4f 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/QueryTest.scala
@@ -198,6 +198,9 @@ abstract class QueryTest extends PlanTest {
       case a: ImperativeAggregate => return
     }
 
+    // bypass hive tests before we fix all corner cases in hive module.
+    if (this.getClass.getName.startsWith("org.apache.spark.sql.hive")) return
+
     val jsonString = try {
       logicalPlan.toJSON
     } catch {
@@ -209,9 +212,6 @@ abstract class QueryTest extends PlanTest {
           """.stripMargin, e)
     }
 
-    // bypass hive tests before we fix all corner cases in hive module.
-    if (this.getClass.getName.startsWith("org.apache.spark.sql.hive")) return
-
     // scala function is not serializable to JSON, use null to replace them so that we can compare
     // the plans later.
     val normalized1 = logicalPlan.transformAllExpressions {