
Differences in how datetime columns are treated with arrow=True #487

Open · theroggy opened this issue Oct 17, 2024 · 12 comments

Labels: question (Further information is requested)

Comments

@theroggy
Member

theroggy commented Oct 17, 2024

When arrow is used, there seem to be several issues/differences in how datetimes are treated compared to the behaviour without arrow. Most of the differences appear to occur when reading.

To demonstrate this, I created a PR that adds test cases showing the behaviour: #486. Some more details can be found in a comment further down in this thread.

Most likely (some of) these issues/behaviours would best be solved/changed upstream in GDAL...
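
For context, a round-trip along these lines is enough to show the differences. This is just a minimal sketch, not the actual tests from #486, and it assumes pyogrio's read_dataframe/write_dataframe with use_arrow=True; the path is a placeholder.

```python
# Minimal sketch (not the tests from #486): write a naive datetime column to GPKG
# and read it back once without and once with Arrow.
import geopandas as gpd
import pandas as pd
import pyogrio
from shapely.geometry import Point

gdf = gpd.GeoDataFrame(
    {
        "dt": pd.to_datetime(["2020-01-01T09:00:00.123"]),  # tz-naive
        "geometry": [Point(0, 0)],
    },
    crs="EPSG:4326",
)
pyogrio.write_dataframe(gdf, "/tmp/datetimes.gpkg")

without_arrow = pyogrio.read_dataframe("/tmp/datetimes.gpkg")
with_arrow = pyogrio.read_dataframe("/tmp/datetimes.gpkg", use_arrow=True)

# Without arrow the column stays naive; with arrow it comes back tagged as UTC.
print(without_arrow["dt"].dtype)
print(with_arrow["dt"].dtype)
```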

@rouault

@theroggy theroggy added the question Further information is requested label Oct 17, 2024
@theroggy theroggy changed the title Using arrow=True with naive datetimes... Using arrow=True with naive datetimes? Oct 17, 2024
@jorisvandenbossche
Member

Do you know whether it is the writing or the reading that adds the timezone information?

And do you see this happen for different file formats? (i.e. it's not GPKG specific?)

@jorisvandenbossche
Member

jorisvandenbossche commented Oct 18, 2024

See also OSGeo/gdal#8460, an issue I raised earlier about timezone support. The original issue is fixed, but it contains some potentially relevant discussion related to GPKG as well.

@theroggy
Member Author

theroggy commented Oct 18, 2024

I had a deeper and broader look... and it's a bit complicated, as is typical with datetimes and timezones :-). Based on this, it looks like there are some issues when reading datetimes with timezones as well. I added some more test cases to PR #486.

For datetimes WITHOUT timezone:

This looks OK for most file formats: the values just stay naive datetimes after being written and read again.

Only for GPKG is there something wrong: the datetime is written fine (the naive datetime is kept without changes to the time), but when read the datetime is assumed to be UTC...

  • 2020-01-01T09:00:00.123 becomes 2020-01-01T09:00:00.123+00:00
  • This doesn't seem right, as it gives the impression the time is actually in UTC, so when applications localize the time for visualisation the time will be wrong.

For datetimes WITH timezone:

  • For GPKG, the datetimes are written as they are, but when read they are converted to UTC.
    • E.g. 2020-01-01T09:00:00.123-05:00 becomes 2020-01-01T14:00:00.123+00:00
    • Technically the instant stays the same, but I don't think it is practical that this happens?
  • For FlatGeobuf, the hours stay the same, but the timezone information is just dropped. Not sure whether this happens when reading or writing.
    • E.g. 2020-01-01T09:00:00.123-05:00 becomes 2020-01-01T09:00:00.123
    • Timezone info is lost, which doesn't sound ideal. Maybe FlatGeobuf doesn't support timezone info?
  • For GeoJSON, the datetimes are written as they are, which is OK, but when read the hours are changed and the timezone offset is retained, so you end up with wrong times.
    • E.g. 2020-01-01T09:00:00.123-05:00 becomes 2020-01-01T04:00:00.123-05:00
    • This changes the information quite a lot.
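
To make the first (naive datetime) concern above concrete, here is a small pandas illustration of what happens when a wall-clock time that was stored naive gets reported as UTC and is then localized by an application (Europe/Brussels is picked arbitrarily as the viewer's timezone):

```python
import pandas as pd

wall_time = pd.Timestamp("2020-01-01 09:00:00.123")  # what was written (naive)
as_utc = wall_time.tz_localize("UTC")                # what the arrow read reports

# An application displaying the value in the viewer's local time now shows 10:00
# instead of the 09:00 that was originally written.
print(as_utc.tz_convert("Europe/Brussels"))  # 2020-01-01 10:00:00.123+01:00
```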

@theroggy theroggy changed the title Using arrow=True with naive datetimes? Issues with arrow=True for datetime columns? Oct 18, 2024
rouault added a commit to rouault/gdal that referenced this issue Oct 19, 2024
@rouault

rouault commented Oct 19, 2024

Only for GPKG, the datetime is written fine (naive datetime is kept without changes to the time), but when read the datetime is assumed to be UTC...

Format limitation. Complicated... In theory GPKG only supports UTC datetimes. As an extension, OGR also supports datetimes without timezone or with other timezones. But Arrow reading makes things complicated, as Arrow imposes a single timezone (or absence of timezone) for a whole column, and we have no way to know that for a GPKG without reading the content of the entire column, which we don't do for obvious performance reasons. So, since a conformant GPKG is assumed to be UTC, we optimistically lie by reporting an Arrow UTC datetime, which breaks here. I don't see any way to fix that without doing a full scan, or developing some extension to GPKG to store the single timezone when OGR writes and all values have the same timezone info.
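
To make that constraint concrete, here is a small pyarrow sketch (pyarrow is only used here as an illustration of the Arrow type system; it is not part of the discussion above): the timezone is part of the column type, not of individual values.

```python
import pyarrow as pa

# The timezone (or its absence) is declared once, in the column type.
naive_type = pa.timestamp("ms")           # timestamp without timezone
utc_type = pa.timestamp("ms", tz="UTC")   # timestamp, values interpreted as UTC instants

print(naive_type)  # timestamp[ms]
print(utc_type)    # timestamp[ms, tz=UTC]

# There is no way to say "row 1 is UTC+01:00, row 2 is naive" within one timestamp
# column, which is exactly the information a GPKG header cannot provide without
# scanning the whole column.
```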

For GPKG, the datetimes are written as they are, but when read they are converted to UTC.

Answered above

For FlatGeobuf, the hours stay the same, but the timezone information is just dropped. Not sure whether this happens when reading or writing.

In FlatGeobuf, datetimes are written as ISO 8601 strings, but within a column you can have a mix of different timezones, or values with/without timezones. On reading with Arrow, we have a similar issue to GPKG: we must decide on an Arrow column type without having that information. Here we just drop the timezone. I don't see any way to fix that without doing a full scan, or developing some extension to FlatGeobuf to store the single timezone when OGR writes and all values have the same timezone info (CC @bjornharrtell)

For GeoJSON, the datetimes are written as they are, which is OK, but when read the hours are changed and the timezone offset is retained, so you end up with wrong times.

OK, that one is a true bug. Addressed in OSGeo/gdal#11049.
But in my defence, I can't find anything in https://arrow.apache.org/docs/format/Columnar.html that explains exactly which numeric value you're supposed to write in a timestamp-with-timezone column (whether it is UTC, which I believe it must be and probably makes the most sense, or relative to the timezone, which is what was being written).

@bjornharrtell

Not sure I fully understand, but is a workaround for FlatGeobuf then to just use/assume UTC?

@rouault

rouault commented Oct 19, 2024

Not sure I fully understand, but is a workaround for FlatGeobuf then to just use/assume UTC?

In Arrow, you must indicate the timezone of a datetime column for the whole column: without timezone, with timezone UTC, with timezone UTC+XXX, etc. The FlatGeobuf format, when just reading its header, doesn't convey that information. Hence on reading with Arrow, an Arrow column without timezone is reported, and we compute a timestamp value ignoring the timezone (we could have decided to write the value as UTC instead. Not sure why I didn't do that. Probably because of the confusion I had about how Arrow actually stores timestamps).
If the user did indeed create their FlatGeobuf file with datetimes already in UTC, the impact would likely be smaller, in that the Arrow timestamp value would match the UTC value.

@theroggy
Member Author

Indeed complicated.

How about using the first row as a heuristic? (A rough sketch follows after the list.)

  • if no timezone: assume no timezone for everything. If other rows do have a timezone, convert everything to the current local timezone of the machine so the datetimes stay consistent across all rows?
  • if timezone: use this timezone and convert all rows to this timezone if other rows have other timezones
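
The sketch referred to above: a driver-agnostic Python illustration of the heuristic over ISO 8601 strings. This is not GDAL code, the function name is made up, and dateutil is only used for convenient parsing.

```python
from dateutil import parser


def first_row_heuristic(iso_values):
    # Illustrative sketch of the proposed first-row heuristic (not GDAL code).
    parsed = [parser.isoparse(v) for v in iso_values]
    first = parsed[0]
    if first.tzinfo is None:
        # First row is naive: report a "without timezone" column. Rows that do
        # carry a timezone are converted to the machine's local time and then
        # made naive, so all values stay consistent (one reading of the proposal).
        return [
            dt.astimezone().replace(tzinfo=None) if dt.tzinfo else dt
            for dt in parsed
        ]
    # First row has a timezone: convert every tz-aware row to that timezone
    # (naive rows are left untouched; the proposal doesn't cover them here).
    return [dt.astimezone(first.tzinfo) if dt.tzinfo else dt for dt in parsed]


print(first_row_heuristic(
    ["2020-01-01T09:00:00.123-05:00", "2020-01-01T09:00:00.123+01:00"]
))
```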

@theroggy theroggy changed the title Issues with arrow=True for datetime columns? Differences in how datetime columns are treated with arrow=True Oct 19, 2024
@kylebarron
Contributor

kylebarron commented Oct 19, 2024

In case it's useful at all, in my (non-GDAL) FlatGeobuf to Arrow reader I intend to take option 1: assign a UTC time zone to the timestamp column and convert the values to UTC (though I do currently see a bug in my code).

@rouault

rouault commented Oct 19, 2024

How about using the first row as a heuristic?

Such a complication would have to be implemented in all drivers that support datetimes with potentially non-UTC timezones... At least all those of interest from an Arrow perspective (perhaps spread the load over all driver maintainers 😂 ?). In some cases there might be performance implications even in just reading one row (thinking of datasets with a huge number of layers, where the layer schema is established at opening time).

  • if no timezone: assume no timezone for everything. If other rows do have a timezone, convert everything to the current local timezone of the machine so the datetimes stay consistent across all rows?

I would be nervous about that part. Absence of timezone might mean either "timezone is truly unknown" or "timezone is local time". And even in the latter case, it would mean that any scripting based on it depends on the local timezone of the machine.

if timezone: use this timezone and convert all rows to this timezone if other rows have other timezones

That one would be non-controversial.

@theroggy
Member Author

theroggy commented Oct 19, 2024

How about using the first row as a heuristic?

In some cases there might be performance implications even in just reading one row (thinking of datasets with a huge number of layers, where the layer schema is established at opening time).

Is this a common case?

  • if no timezone: assume no timezone for everything. If other rows do have a timezone, convert everything to the current local timezone of the machine so the datetimes stay consistent across all rows?

I would be nervous about that part. Absence of timezone might mean either "timezone is truly unknown" or "timezone is local time". And even in the latter case, it would mean that any scripting based on it depends on the local timezone of the machine.

I agree it isn't ideal, but I couldn't think of anything better. In general I think it is a pretty bad idea to mix naive and timezone-aware datetimes in one column, so at least emitting a warning when this is encountered sounds like a good idea.

@jorisvandenbossche
Member

jorisvandenbossche commented Oct 19, 2024

And even in the latter case, it would mean that any scripting based on it depends on the local timezone of the machine.

Yes, I personally would not like to see anything that is system-dependent.

If going with a "first row" heuristic, I would rather raise an error if anything later is inconsistent (regarding with vs. without timezone), i.e. if you have a mix of tz-aware and tz-naive timestamps.

But in my defence, I can't find anything in https://arrow.apache.org/docs/format/Columnar.html that explains exactly which numeric value you're supposed to write in a timestamp-with-timezone column (whether it is UTC, which I believe it must be and probably makes the most sense, or relative to the timezone, which is what was being written).

Yeah, fully agreed that page is not ideal... Right now the information about the actual types (not just the physical layout) lives in Schema.fbs; for this topic it is https://github.com/apache/arrow/blob/54697eaefeb151ad195f3a5812aafa48ba77c45e/format/Schema.fbs#L284-L302
(see apache/arrow#42011 for an issue I opened a while ago about improving that).
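
Those Schema.fbs semantics are also easy to check from Python: for a timestamp column with a timezone, the stored integer is the offset from the UTC epoch, while the timezone only affects interpretation/display. A small pyarrow sketch, not part of the discussion above:

```python
import datetime
import pyarrow as pa

# 2020-01-01T09:00:00-05:00 is the instant 2020-01-01T14:00:00 UTC.
dt = datetime.datetime(
    2020, 1, 1, 9, 0, 0,
    tzinfo=datetime.timezone(datetime.timedelta(hours=-5)),
)

arr = pa.array([dt], type=pa.timestamp("s", tz="-05:00"))

# The physical value is seconds since the UTC epoch, i.e. the UTC instant,
# not a wall-clock value relative to the -05:00 offset.
print(arr.cast(pa.int64())[0])  # 1577887200 == 2020-01-01T14:00:00Z
print(arr[0])                   # the same instant, labelled with the column's timezone
```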

@rouault

rouault commented Oct 19, 2024

That one would be non-controversial.

Actually, I was almost forgetting that OGRLayer::GetArrowStream() has a TIMEZONE option (cf. https://gdal.org/en/latest/doxygen/classOGRLayer.html#a3ffa8511632cbb7cff06a908e6668f55), which you can set to UTC (the GeoPackage driver sets it to UTC if the user hasn't explicitly specified it). So at least for layers where all records have a known timezone, this is the way to get values normalized to UTC.
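
For reference, a sketch of how that option could be passed when pulling an Arrow stream directly through GDAL's Python bindings (assuming a GDAL build where GetArrowStreamAsPyArrow is available; "example.gpkg" is just a placeholder):

```python
from osgeo import ogr

ogr.UseExceptions()
ds = ogr.Open("example.gpkg")
lyr = ds.GetLayer(0)

# TIMEZONE is the GetArrowStream() option mentioned above; "UTC" asks the driver
# to report/normalize datetime values as UTC where it can.
stream = lyr.GetArrowStreamAsPyArrow(["TIMEZONE=UTC"])
print(stream.schema)
for record_batch in stream:
    print(record_batch)
```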
