-00:00 on draft-newman-datetime-00.txt
kre at munnari.OZ.AU
Fri Jan 3 05:26:34 UTC 1997
Date: Thu, 02 Jan 1997 21:46:47 -0500
From: kuhn at cs.purdue.edu ("Markus G. Kuhn")
Message-ID: <199701030246.VAA09620 at ector.cs.purdue.edu>
> The 'Z' is not part of any deprecated code, it is the official
> ISO 8601 designation for times given in UTC.
I understand that, I was suggesting that it be deprecated, not
that it is.
> There are many
> applications where you are not interested in local time,
> and there, the 'Z' is a nice and short indicator that we use UTC.
> I do not like the idea to add a useless and ugly -00 in protocols
> where only UTC will be used. But a simple 'Z' to remind the casual
> viewer that this is UTC won't do any harm.
If *only* UTC will be used, I don't care, as the zone doesn't need
to be examined or parsed in any case - I'd omit the thing entirely.
It is unlikely that humans (casual viewers) will ever do much
looking at protocols that carry only UTC; human factors wouldn't
allow a protocol that people will see to carry only UTC times,
as people want the times they see to be in local time.
You're also assuming that casual observers have some idea what Z
might mean, which I find pretty unlikely.
> The lines of code to detect -00 or Z are practically identical,
Yes, when you care to detect this case, they are. The difference
is when you don't care - all you want is to get the given time
converted (from whatever zone it was in) to either UTC or your
own local time. There, there is a difference: -00 can be parsed
by atoi() (or whatever) just the same as +00, without any special
cases at all. On the other hand, "Z" requires a magic special
test for that case first.
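That arithmetic path can be sketched as follows; zone_to_minutes is a hypothetical helper (not anything from the draft), assuming the suffix is either "Z" or a signed four-digit offset. Note how "-0000" flows through the same digit path as "+0000", while "Z" alone needs the up-front test:

```c
#include <stdlib.h>

/* Parse a numeric zone suffix such as "+0930", "-0500", or "-0000"
 * into an offset in minutes east of UTC.  Because "-00..." is just a
 * sign and digits, one atoi()-style path handles every numeric case;
 * only "Z" needs a separate test first. */
int zone_to_minutes(const char *z)
{
    if (z[0] == 'Z' && z[1] == '\0')    /* the one magic special case */
        return 0;
    int sign = (z[0] == '-') ? -1 : 1;
    int hhmm = atoi(z + 1);             /* "0930" -> 930 */
    return sign * ((hhmm / 100) * 60 + hhmm % 100);
}
```

With this, "+0930" yields 570 minutes and "-0000" yields 0, with no branch beyond the one for the literal "Z".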
> and a missing local time indicator is in any case a special case
> that has to be handled.
Not at all; as long as the time and zone together are consistent,
and after the two are combined the correct UTC time can be deduced,
often nothing else matters. The really hard case is "this time
is my local time, but I have no idea what the offset from UTC is".
That's not what this discussion is about.
> We go even one step further and remove the redundant minute offset digits.
The minutes offset is certainly not redundant. I find it hard to
believe that anyone who knows anything about time zones can believe
that. In Australia right now there are two different time zones
that are not whole hours: +1030 and +0930. There are two only
because the former (further south) has summer time (it isn't
daylight saving time in Aust, it is Summer Time) and the northern
section (which gets close to the equator, and well into the tropics)
does not. There are also half-hour offsets in India, and other places.
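How much the minute digits matter is easy to show with a small sketch (a hypothetical helper, not from any spec): converting a wall-clock time to UTC with a half-hour offset such as +0930 produces a result that hour-only digits simply cannot express.

```c
/* Convert local HH:MM plus a zone offset (minutes east of UTC) into
 * UTC minutes-past-midnight, wrapping across the date boundary.
 * A half-hour zone such as +0930 (570 minutes) shifts the result by
 * 30 minutes relative to a truncated "+09" (540 minutes). */
int local_to_utc_min(int hh, int mm, int offset_min)
{
    int t = (hh * 60 + mm - offset_min) % (24 * 60);
    return t < 0 ? t + 24 * 60 : t;
}
```

For example, noon in +0930 is 02:30 UTC (150 minutes past midnight), while noon in a truncated +09 would come out as 03:00 UTC - a silent 30-minute error.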
> RFC822 has a relatively bad date/time format design.
Actually, ignoring the alpha timezone nonsense (both the 3 letter
versions, and the one letter ones) I think it is fairly reasonable.
> Including weekday indicators
They are optional, harmless (you never need to parse them), and
generally useful - I don't know about you, but right now I can't
think just what day 29 Dec 1996 was. I know it was a few days ago,
but just when it was, or what I did that day, I have no idea. But
when I look at my calendar and see it was last Sunday, then I have
a much better idea just what that was (it was the day I didn't
get to go to the cricket because the game ended early the previous day).
You can argue that the UA should be calculating and displaying the
weekday (if desired) based upon the rest of the date, but UAs
of the time (and even now) were not nearly that complex, and
tended to simply show the text as it was transmitted.
> and 3-letter month names that have to be processed by a lookup table
You actually want to be thankful that they did it that way; if
not, what we would have had to deal with in this message would
have been something like
    1/3/97 (or 1/3/1997)
which I think you will agree with me means March 1 this year, but
isn't what the people in the US believe it means...
That might not need a lookup table, but it is ambiguous, and
much much worse. You certainly wouldn't have got 19970103,
as that's horrid for humans to parse, and unknown in the US
(and generally here as well, if that matters).
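The lookup table at issue is tiny in practice. A minimal sketch (a hypothetical helper, exact-case match only for brevity):

```c
#include <string.h>

/* Map an RFC 822 three-letter month name to 1..12 via a flat lookup
 * table; returns 0 for an unknown name.  Unlike "1/3/97", a date
 * written "3 Jan 1997" cannot be misread as the 1st of March. */
int month_num(const char *name)
{
    static const char tab[] = "JanFebMarAprMayJunJulAugSepOctNovDec";
    for (int i = 0; i < 12; i++)
        if (strncmp(name, tab + 3 * i, 3) == 0)
            return i + 1;
    return 0;
}
```

So the "cost" of the named-month design is a dozen strncmp() calls at worst, in exchange for a format no reader can misorder.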
> simply demonstrates that the RFC822 designers were careless here.
I think you are wrong about that; this part I think they did as
well as could be done - for that matter, so did they in most other respects.
> RFC822 is anyway a very strange and difficult to read standard.
That's a different issue, and I mostly agree. It can be difficult
to decipher. But once you do, you generally find that the actual
spec is fairly reasonable (of course it helps if you understand the
constraints that operated at the time - 822 wasn't written in a
vacuum, 733 existed before it, etc).
> We are talking about an ASCII date/time format to be used in NEW
> protocol designs where there is no requirement for backward compatibility.
> I claim that the RFC822 date format has been used in new
> protocols (e.g., HTTP), not because it is such a great design, but because
> it was easy to reference.
I only partly agree. It has also been fairly widely used because
it is generally better to have just one way of doing things than
several. That's why I doubt that a new form will be widely used.
Applications could simply use Unix's "ctime" format, which would
be much easier for many to generate. They don't - not because
there is anything particularly wrong with it, and not because it
is an OS-specific format (it isn't, it was just invented there) -
but because it is different, and there is no compelling reason to
do something different when a way to do the same function already exists.