author     Jason Lee <cjlee@us.ibm.com>       2016-01-27 09:55:10 -0800
committer  Yin Huai <yhuai@databricks.com>    2016-01-27 09:55:10 -0800
commit     edd473751b59b55fa3daede5ed7bc19ea8bd7170 (patch)
tree       f05e166a4fd9959182f72a66a15f6118684c9468 /python/pyspark/sql
parent     41f0c85f9be264103c066935e743f59caf0fe268 (diff)
[SPARK-10847][SQL][PYSPARK] Pyspark - DataFrame - Optional Metadata with `None` triggers cryptic failure
The error message is now changed from "Do not support type class scala.Tuple2." to "Do not support type class org.json4s.JsonAST$JNull$" to be more informative about what is not supported. Also, StructType metadata now handles JNull correctly, i.e., {'a': None}. test_metadata_null is added to tests.py to show the fix works.
Author: Jason Lee <cjlee@us.ibm.com>
Closes #8969 from jasoncl/SPARK-10847.
Diffstat (limited to 'python/pyspark/sql')
-rw-r--r--  python/pyspark/sql/tests.py | 7 +++++++
1 file changed, 7 insertions(+), 0 deletions(-)
diff --git a/python/pyspark/sql/tests.py b/python/pyspark/sql/tests.py
index 7593b991a7..410efbafe0 100644
--- a/python/pyspark/sql/tests.py
+++ b/python/pyspark/sql/tests.py
@@ -747,6 +747,13 @@ class SQLTests(ReusedPySparkTestCase):
         except ValueError:
             self.assertEqual(1, 1)
 
+    def test_metadata_null(self):
+        from pyspark.sql.types import StructType, StringType, StructField
+        schema = StructType([StructField("f1", StringType(), True, None),
+                             StructField("f2", StringType(), True, {'a': None})])
+        rdd = self.sc.parallelize([["a", "b"], ["c", "d"]])
+        self.sqlCtx.createDataFrame(rdd, schema)
+
     def test_save_and_load(self):
         df = self.df
         tmpPath = tempfile.mkdtemp()
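To see why a Python `None` in field metadata surfaces as `JNull` on the JVM side, note that PySpark ships the schema to Scala as JSON, where each field's metadata dict is serialized along with it; a `None` value becomes a JSON `null`, which json4s parses as `JNull`. A stdlib-only sketch of that round trip (no Spark required; the field dict below mirrors the `f2` field from the test, but the serialization shown here is plain `json`, not Spark's actual wire format):

```python
import json

# A StructField's metadata dict is carried inside the schema JSON.
# This dict mirrors StructField("f2", StringType(), True, {'a': None}).
field_json = {
    "name": "f2",
    "type": "string",
    "nullable": True,
    "metadata": {"a": None},
}

serialized = json.dumps(field_json)
# Python None serializes as JSON null -- the value json4s sees as JNull.
assert '"a": null' in serialized
# And it round-trips back to None on deserialization.
assert json.loads(serialized)["metadata"]["a"] is None
```

Before this patch, the JVM-side `Metadata` parser had no case for `JNull`, so the null fell through to a generic error; the fix handles it and makes the remaining error message name the actual unsupported type.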