path: root/sql/core
author    Burak Yavuz <brkyvz@gmail.com>  2016-09-22 16:50:22 -0700
committer Shixiong Zhu <shixiong@databricks.com>  2016-09-22 16:50:22 -0700
commit    a1661968310de35e710e3b6784f63a77c44453fc (patch)
tree      03f67311feba426ef53c0b3ca481d7727bbb2539 /sql/core
parent    f4f6bd8c9884e3919509907307fda774f56b5ecc (diff)
[SPARK-17569][TEST] Make the unit test added for SPARK-17569 work again
## What changes were proposed in this pull request?

A [PR](https://github.com/apache/spark/commit/a6aade0042d9c065669f46d2dac40ec6ce361e63) was merged concurrently that caused the unit test added in PR #15122 to no longer test anything. This PR fixes that test.

## How was this patch tested?

Changed line https://github.com/apache/spark/blob/0d634875026ccf1eaf984996e9460d7673561f80/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/FileStreamSource.scala#L137 from `false` to `true` and verified that the unit test then failed.

Author: Burak Yavuz <brkyvz@gmail.com>

Closes #15203 from brkyvz/fix-test.
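For context, the one-line test fix shown in the diff below reflects a change in the arity of `FileEntry`: the concurrently merged commit appears to have added a third field to the entries written into `FileStreamSourceLog`, so the test's constructor call needed an extra argument. The following is a minimal sketch of that shape, with the field name and types assumed for illustration; it is not the verbatim Spark source.

```scala
// Hypothetical sketch of the FileEntry shape implied by the diff below.
// Only the extra trailing argument (0) is confirmed by the diff; the field
// name `batchId` and the exact types are assumptions for illustration.
case class FileEntry(path: String, timestamp: Long, batchId: Long)

// Before the concurrent change, the test constructed an entry as:
//   FileEntry(s"$scheme:///file1", 100L)
// After it, the test must supply the extra field:
//   FileEntry(s"$scheme:///file1", 100L, 0)
```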
Diffstat (limited to 'sql/core')
-rw-r--r--  sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/FileStreamSourceSuite.scala  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/FileStreamSourceSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/FileStreamSourceSuite.scala
index e8fa6a59c5..0795a0527f 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/FileStreamSourceSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/streaming/FileStreamSourceSuite.scala
@@ -92,7 +92,7 @@ class FileStreamSourceSuite extends SparkFunSuite with SharedSQLContext {
val dir = new File(temp, "dir") // use non-existent directory to test whether log make the dir
val metadataLog =
new FileStreamSourceLog(FileStreamSourceLog.VERSION, spark, dir.getAbsolutePath)
- assert(metadataLog.add(0, Array(FileEntry(s"$scheme:///file1", 100L))))
+ assert(metadataLog.add(0, Array(FileEntry(s"$scheme:///file1", 100L, 0))))
val newSource = new FileStreamSource(spark, s"$scheme:///", "parquet", StructType(Nil),
dir.getAbsolutePath, Map.empty)