author    petermaxlee <petermaxlee@gmail.com>    2016-08-10 23:22:14 -0700
committer Reynold Xin <rxin@databricks.com>    2016-08-10 23:22:14 -0700
commit    0db373aaf87991207a7a8a09853b6fa602f0f45b (patch)
tree      fb921fcaa1108e90ca4c9f0c9044fba923f8e5e5 /sql/core/src/test/resources/sql-tests/results/limit.sql.out
parent    7a6a3c3fbcea889ca20beae9d4198df2fe53bd1b (diff)
[SPARK-17011][SQL] Support testing exceptions in SQLQueryTestSuite
## What changes were proposed in this pull request?

This patch adds exception testing to SQLQueryTestSuite. When there is an exception in query execution, the query result contains the exception class along with the exception message.

As part of this, I moved some additional test cases for limit from SQLQuerySuite over to SQLQueryTestSuite.

## How was this patch tested?

This is a test harness change.

Author: petermaxlee <petermaxlee@gmail.com>

Closes #14592 from petermaxlee/SPARK-17011.
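The harness change itself lives in SQLQueryTestSuite (Scala); the snippet below is only a minimal sketch of the idea, not the patch's code. The helper name `resultOrException` is hypothetical; the sketch shows how a query's outcome can be reduced either to its schema and rows or, on failure, to the exception class plus message, which is the shape recorded in limit.sql.out below.

```scala
import org.apache.spark.sql.{AnalysisException, SparkSession}

// Minimal sketch (not the actual SQLQueryTestSuite code): run one query and
// capture either its schema and rows or, if analysis fails, the exception
// class name and message, mirroring the golden-file format below.
def resultOrException(spark: SparkSession, sql: String): (String, Seq[String]) = {
  try {
    val df = spark.sql(sql)
    // Success: schema in catalog form (e.g. struct<key:int,value:string>) plus rows.
    (df.schema.catalogString, df.collect().map(_.mkString("\t")).toSeq)
  } catch {
    case e: AnalysisException =>
      // Failure: empty schema plus exception class and message, as in queries 5-8.
      ("struct<>", Seq(e.getClass.getName, e.getMessage))
  }
}
```

Applied to the negative-limit query below, such a helper would yield `struct<>` plus the AnalysisException lines shown under query 5.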
Diffstat (limited to 'sql/core/src/test/resources/sql-tests/results/limit.sql.out')
-rw-r--r--  sql/core/src/test/resources/sql-tests/results/limit.sql.out  83
1 files changed, 83 insertions, 0 deletions
diff --git a/sql/core/src/test/resources/sql-tests/results/limit.sql.out b/sql/core/src/test/resources/sql-tests/results/limit.sql.out
new file mode 100644
index 0000000000..b71b058869
--- /dev/null
+++ b/sql/core/src/test/resources/sql-tests/results/limit.sql.out
@@ -0,0 +1,83 @@
+-- Automatically generated by SQLQueryTestSuite
+-- Number of queries: 9
+
+
+-- !query 0
+select * from testdata limit 2
+-- !query 0 schema
+struct<key:int,value:string>
+-- !query 0 output
+1 1
+2 2
+
+
+-- !query 1
+select * from arraydata limit 2
+-- !query 1 schema
+struct<arraycol:array<int>,nestedarraycol:array<array<int>>>
+-- !query 1 output
+[1,2,3] [[1,2,3]]
+[2,3,4] [[2,3,4]]
+
+
+-- !query 2
+select * from mapdata limit 2
+-- !query 2 schema
+struct<mapcol:map<int,string>>
+-- !query 2 output
+{1:"a1",2:"b1",3:"c1",4:"d1",5:"e1"}
+{1:"a2",2:"b2",3:"c2",4:"d2"}
+
+
+-- !query 3
+select * from testdata limit 2 + 1
+-- !query 3 schema
+struct<key:int,value:string>
+-- !query 3 output
+1 1
+2 2
+3 3
+
+
+-- !query 4
+select * from testdata limit CAST(1 AS int)
+-- !query 4 schema
+struct<key:int,value:string>
+-- !query 4 output
+1 1
+
+
+-- !query 5
+select * from testdata limit -1
+-- !query 5 schema
+struct<>
+-- !query 5 output
+org.apache.spark.sql.AnalysisException
+The limit expression must be equal to or greater than 0, but got -1;
+
+
+-- !query 6
+select * from testdata limit key > 3
+-- !query 6 schema
+struct<>
+-- !query 6 output
+org.apache.spark.sql.AnalysisException
+The limit expression must evaluate to a constant value, but got (testdata.`key` > 3);
+
+
+-- !query 7
+select * from testdata limit true
+-- !query 7 schema
+struct<>
+-- !query 7 output
+org.apache.spark.sql.AnalysisException
+The limit expression must be integer type, but got boolean;
+
+
+-- !query 8
+select * from testdata limit 'a'
+-- !query 8 schema
+struct<>
+-- !query 8 output
+org.apache.spark.sql.AnalysisException
+The limit expression must be integer type, but got string;
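For context, the limit behaviour exercised by this golden file can be reproduced interactively. The following is a self-contained sketch against a local SparkSession with a throwaway `testdata` view; the session setup and sample rows here are assumptions for illustration, not part of the patch.

```scala
import org.apache.spark.sql.{AnalysisException, SparkSession}

// Self-contained sketch: a local session and a throwaway "testdata" view
// standing in for the fixture behind these golden files.
val spark = SparkSession.builder().master("local[1]").appName("limit-sketch").getOrCreate()
import spark.implicits._

Seq((1, "1"), (2, "2"), (3, "3")).toDF("key", "value").createOrReplaceTempView("testdata")

// A constant, non-negative integer limit is accepted (cf. query 3).
spark.sql("select * from testdata limit 2 + 1").show()

// A negative limit is rejected during analysis (cf. query 5).
try {
  spark.sql("select * from testdata limit -1").show()
} catch {
  case e: AnalysisException => println(e.getMessage)
}

spark.stop()
```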