author    Dongjoon Hyun <dongjoon@apache.org>              2016-12-05 10:36:13 -0800
committer Shivaram Venkataraman <shivaram@cs.berkeley.edu> 2016-12-05 10:36:13 -0800
commit    410b7898661f77e748564aaee6a5ab7747ce34ad
tree      a2ba472d676081a4ae419d000709f04d885777c5
parent    eb8dd68132998aa00902dfeb935db1358781e1c1
[MINOR][DOC] Use SparkR `TRUE` value and add default values for `StructField` in SQL Guide.
## What changes were proposed in this pull request?
In the `SQL Programming Guide`, this PR uses `TRUE` instead of `True` for SparkR boolean values and documents the default value of `nullable` for `StructField` in Scala/Python/R (i.e., "Note: The default value of nullable is true."). In the Java API, `nullable` is not optional, so no such note is added there.
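For reference, a minimal Scala sketch of the default being documented (illustrative only, not part of this doc-only patch; the field name is made up):

```scala
import org.apache.spark.sql.types.{IntegerType, StructField}

// In the Scala API, `nullable` is an optional constructor parameter
// that defaults to `true`, which is exactly what the guide now notes.
val explicitField = StructField("age", IntegerType, nullable = true)
val defaultField  = StructField("age", IntegerType)

// StructField is a case class, so the two are structurally equal:
assert(explicitField == defaultField)
```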
**BEFORE**
* SPARK 2.1.0 RC1
http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc1-docs/sql-programming-guide.html#data-types
**AFTER**
* R
<img width="916" alt="screen shot 2016-12-04 at 11 58 19 pm" src="https://cloud.githubusercontent.com/assets/9700541/20877443/abba19a6-ba7d-11e6-8984-afbe00333fb0.png">
* Scala
<img width="914" alt="screen shot 2016-12-04 at 11 57 37 pm" src="https://cloud.githubusercontent.com/assets/9700541/20877433/99ce734a-ba7d-11e6-8bb5-e8619041b09b.png">
* Python
<img width="914" alt="screen shot 2016-12-04 at 11 58 04 pm" src="https://cloud.githubusercontent.com/assets/9700541/20877440/a5c89338-ba7d-11e6-8f92-6c0ae9388d7e.png">
## How was this patch tested?
Manual.
```
# Build the docs without regenerating the API docs (faster),
# then open the generated site to inspect the type tables.
cd docs
SKIP_API=1 jekyll build
open _site/index.html
```
Author: Dongjoon Hyun <dongjoon@apache.org>
Closes #16141 from dongjoon-hyun/SPARK-SQL-GUIDE.
docs/sql-programming-guide.md | 13 +++++++++-----
1 file changed, 8 insertions(+), 5 deletions(-)
diff --git a/docs/sql-programming-guide.md b/docs/sql-programming-guide.md
index c7ad06c639..e59c327915 100644
--- a/docs/sql-programming-guide.md
+++ b/docs/sql-programming-guide.md
@@ -1851,7 +1851,8 @@ You can access them by doing
   <td> The value type in Scala of the data type of this field
   (For example, Int for a StructField with the data type IntegerType) </td>
   <td>
-  StructField(<i>name</i>, <i>dataType</i>, <i>nullable</i>)
+  StructField(<i>name</i>, <i>dataType</i>, [<i>nullable</i>])<br />
+  <b>Note:</b> The default value of <i>nullable</i> is <i>true</i>.
   </td>
 </tr>
 </table>
@@ -2139,7 +2140,8 @@ from pyspark.sql.types import *
   <td> The value type in Python of the data type of this field
   (For example, Int for a StructField with the data type IntegerType) </td>
   <td>
-  StructField(<i>name</i>, <i>dataType</i>, <i>nullable</i>)
+  StructField(<i>name</i>, <i>dataType</i>, [<i>nullable</i>])<br />
+  <b>Note:</b> The default value of <i>nullable</i> is <i>True</i>.
   </td>
 </tr>
 </table>
@@ -2260,7 +2262,7 @@ from pyspark.sql.types import *
   <td> vector or list </td>
   <td>
   list(type="array", elementType=<i>elementType</i>, containsNull=[<i>containsNull</i>])<br />
-  <b>Note:</b> The default value of <i>containsNull</i> is <i>True</i>.
+  <b>Note:</b> The default value of <i>containsNull</i> is <i>TRUE</i>.
   </td>
 </tr>
 <tr>
@@ -2268,7 +2270,7 @@ from pyspark.sql.types import *
   <td> environment </td>
   <td>
   list(type="map", keyType=<i>keyType</i>, valueType=<i>valueType</i>, valueContainsNull=[<i>valueContainsNull</i>])<br />
-  <b>Note:</b> The default value of <i>valueContainsNull</i> is <i>True</i>.
+  <b>Note:</b> The default value of <i>valueContainsNull</i> is <i>TRUE</i>.
   </td>
 </tr>
 <tr>
@@ -2285,7 +2287,8 @@ from pyspark.sql.types import *
   <td> The value type in R of the data type of this field
   (For example, integer for a StructField with the data type IntegerType) </td>
   <td>
-  list(name=<i>name</i>, type=<i>dataType</i>, nullable=<i>nullable</i>)
+  list(name=<i>name</i>, type=<i>dataType</i>, nullable=[<i>nullable</i>])<br />
+  <b>Note:</b> The default value of <i>nullable</i> is <i>TRUE</i>.
   </td>
 </tr>
 </table>
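The three R hunks document the same `TRUE` defaults for the complex types. As a cross-check (an illustrative sketch, not taken from the patch), the Scala counterparts of `containsNull` and `valueContainsNull` default the same way:

```scala
import org.apache.spark.sql.types.{ArrayType, IntegerType, MapType, StringType}

// ArrayType(elementType) defaults containsNull to true ...
assert(ArrayType(IntegerType) == ArrayType(IntegerType, containsNull = true))

// ... and MapType(keyType, valueType) defaults valueContainsNull to true.
assert(MapType(StringType, IntegerType) ==
  MapType(StringType, IntegerType, valueContainsNull = true))
```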