path: root/python/pyspark/sql/types.py
author     0x0FFF <programmerag@gmail.com>          2015-09-01 14:34:59 -0700
committer  Davies Liu <davies.liu@gmail.com>        2015-09-01 14:34:59 -0700
commit     bf550a4b551b6dd18fea3eb3f70497f9a6ad8e6c (patch)
tree       24904c89a11ee8420b3998234a54d6b7047dc48f /python/pyspark/sql/types.py
parent     ec012805337926e56343be2761a1037296446880 (diff)
[SPARK-10162] [SQL] Fix timezone being omitted in the PySpark DataFrame filter function
This PR addresses [SPARK-10162](https://issues.apache.org/jira/browse/SPARK-10162). The issue is with the DataFrame filter() function when a datetime.datetime is passed to it:

* Timezone information of the datetime is ignored
* The datetime is assumed to be in the local timezone, which depends on the OS timezone setting

The fix includes both the code change and a regression test.

Problem reproduction code on master:

```python
import pytz
from datetime import datetime
from pyspark.sql import *
from pyspark.sql.types import *

sqc = SQLContext(sc)
df = sqc.createDataFrame([], StructType([StructField("dt", TimestampType())]))

m1 = pytz.timezone('UTC')
m2 = pytz.timezone('Etc/GMT+3')

df.filter(df.dt > datetime(2000, 1, 1, tzinfo=m1)).explain()
df.filter(df.dt > datetime(2000, 1, 1, tzinfo=m2)).explain()
```

It gives the same timestamp for both, ignoring the time zone:

```
>>> df.filter(df.dt > datetime(2000, 1, 1, tzinfo=m1)).explain()
Filter (dt#0 > 946713600000000)
 Scan PhysicalRDD[dt#0]
>>> df.filter(df.dt > datetime(2000, 1, 1, tzinfo=m2)).explain()
Filter (dt#0 > 946713600000000)
 Scan PhysicalRDD[dt#0]
```

After the fix:

```
>>> df.filter(df.dt > datetime(2000, 1, 1, tzinfo=m1)).explain()
Filter (dt#0 > 946684800000000)
 Scan PhysicalRDD[dt#0]
>>> df.filter(df.dt > datetime(2000, 1, 1, tzinfo=m2)).explain()
Filter (dt#0 > 946695600000000)
 Scan PhysicalRDD[dt#0]
```

PR [8536](https://github.com/apache/spark/pull/8536) was accidentally closed by me when I dropped the repo.

Author: 0x0FFF <programmerag@gmail.com>

Closes #8555 from 0x0FFF/SPARK-10162.
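For reference, a minimal standalone sketch (not part of the patch, and using datetime.timezone instead of pytz so it runs without extra dependencies) of why the two conversion paths differ:

```python
import calendar
import time
from datetime import datetime, timedelta, timezone

naive = datetime(2000, 1, 1)
# 'Etc/GMT+3' in the POSIX convention is UTC-3; modeled here without pytz.
aware = datetime(2000, 1, 1, tzinfo=timezone(timedelta(hours=-3)))

# Old behavior: time.mktime() interprets the time tuple in the OS-local
# timezone, so the tzinfo attached to `aware` is silently ignored and both
# values depend only on the machine's timezone setting.
print(int(time.mktime(naive.timetuple())))
print(int(time.mktime(aware.timetuple())))

# Fixed behavior: tz-aware datetimes go through utctimetuple()/timegm(),
# which applies the offset: 2000-01-01 00:00 at UTC-3 is 946695600 seconds
# since the epoch, matching the 946695600000000 microseconds shown above.
print(calendar.timegm(aware.utctimetuple()))
```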
Diffstat (limited to 'python/pyspark/sql/types.py')
-rw-r--r--  python/pyspark/sql/types.py  7
1 file changed, 5 insertions(+), 2 deletions(-)
diff --git a/python/pyspark/sql/types.py b/python/pyspark/sql/types.py
index 94e581a783..f84d08d709 100644
--- a/python/pyspark/sql/types.py
+++ b/python/pyspark/sql/types.py
@@ -1290,8 +1290,11 @@ class DatetimeConverter(object):
     def convert(self, obj, gateway_client):
         Timestamp = JavaClass("java.sql.Timestamp", gateway_client)
-        return Timestamp(int(time.mktime(obj.timetuple())) * 1000 + obj.microsecond // 1000)
-
+        seconds = (calendar.timegm(obj.utctimetuple()) if obj.tzinfo
+                   else time.mktime(obj.timetuple()))
+        t = Timestamp(int(seconds) * 1000)
+        t.setNanos(obj.microsecond * 1000)
+        return t
 
 # datetime is a subclass of date, we should register DatetimeConverter first
 register_input_converter(DatetimeConverter())
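As an aside on the trailing comment: a small illustrative sketch (plain Python, not Spark code) of why DatetimeConverter must be registered before any date-based converter:

```python
from datetime import date, datetime

d = datetime(2000, 1, 1, 12, 30)

# datetime is a subclass of date, so a converter matching isinstance(obj, date)
# would also claim datetime objects; registering DatetimeConverter first keeps
# timestamps from being handled as plain dates and losing their time component.
print(isinstance(d, date))      # True
print(isinstance(d, datetime))  # True
```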