I'm attempting to select a local time (local to Australia/Brisbane) from a timestamptz field in a PostgreSQL database, and I've noticed that when I use the +10 time zone abbreviation, PostgreSQL appears to subtract 10 hours from the UTC value instead of adding them. If I use AEST as the abbreviation, 10 hours are correctly added.
When I run the following query, I would expect both returned values to be the same.
select
('2018-01-01T00:00:00Z'::timestamp with time zone) at time zone 'AEST',
('2018-01-01T00:00:00Z'::timestamp with time zone) at time zone '+10';
However, in my case I'm seeing the following results:
"2018-01-01 10:00:00" | "2017-12-31 14:00:00"
If I run the following query, all rows have a "utc_offset" interval of 10:00:00.
select * from pg_timezone_names
where abbrev in ('+10', 'AEST');
Could anyone offer an explanation as to what is going on here?
That is actually described in some detail in the documentation:
PostgreSQL allows you to specify time zones in three different forms:
A full time zone name, for example America/New_York. [...]
A time zone abbreviation, for example PST. [...]
In addition to the timezone names and abbreviations, PostgreSQL will accept POSIX-style time zone specifications of the form STDoffset or STDoffsetDST, where STD is a zone abbreviation, offset is a numeric offset in hours west from UTC, and DST is an optional daylight-savings zone abbreviation, assumed to stand for one hour ahead of the given offset. [...]
[...]
Another issue to keep in mind is that in POSIX time zone names, positive offsets are used for locations west of Greenwich. Everywhere else, PostgreSQL follows the ISO-8601 convention that positive timezone offsets are east of Greenwich.
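You can see all three forms, and the sign flip on the POSIX form, in one query. A minimal sketch (the expected values in the comments follow from the convention above; Australia/Brisbane observes no DST, so its offset is a constant +10):
select
    ('2018-01-01T00:00:00Z'::timestamp with time zone) at time zone 'Australia/Brisbane' as full_name,   -- 2018-01-01 10:00:00
    ('2018-01-01T00:00:00Z'::timestamp with time zone) at time zone 'AEST' as abbreviation,              -- 2018-01-01 10:00:00
    ('2018-01-01T00:00:00Z'::timestamp with time zone) at time zone '+10' as posix_style;                -- 2017-12-31 14:00:00 (10 hours west of UTC)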
So the POSIX equivalent for AEST would be -10 or UTC-10.
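To get the +10 offset you were expecting, invert the sign, or use the full zone name, which sidesteps the POSIX sign convention entirely. A minimal sketch of the corrected query:
select
    ('2018-01-01T00:00:00Z'::timestamp with time zone) at time zone 'UTC-10',
    ('2018-01-01T00:00:00Z'::timestamp with time zone) at time zone 'Australia/Brisbane';
-- both columns now read 2018-01-01 10:00:00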