Back-of-the-envelope cost of extra data :-)

Robert Elz kre at munnari.OZ.AU
Wed May 4 20:25:14 UTC 2005


    Date:        Wed, 4 May 2005 12:09:51 -0400 
    From:        "Olson, Arthur David (NIH/NCI)" <olsona at dc37a.nci.nih.gov>
    Message-ID:  <75DDD376F2B6B546B722398AC161106C7403FA at nihexchange2.nih.gov>

  | Maximum total cost: 650 million computers * three twentieths of a cent:
  | $975,000 (ulp!)

Why do we need to spend that much?   I'm confused.   At first I thought the
400 years in advance stuff was as far as we wanted to predict the future
(and shouldn't the code be generating "stardate 1603.8" by then anyway?).

But then I saw some reference to what happens after the 400 years, where
the rules are to be condensed into an algorithm (kind of like the POSIX
TZ string) and any future time is constructed from that.    The conversion
generated is fantasy, we all know that, but that isn't the point of this mail.
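
(For concreteness, a condensed rule of that kind, in the familiar POSIX
TZ syntax, might look something like

    TZ='AEST-10AEDT,M10.1.0,M4.1.0/3'

i.e. standard time 10 hours ahead of UTC, daylight time one hour further
ahead, starting the first Sunday of October and ending at 03:00 on the
first Sunday of April.   The particular rule shown is just an illustration
I made up, not anything taken from the tables.)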

If at some future point the code is going to switch from the tables to an
algorithmic conversion, why bloat the tables by including so much speculative
future data?   We know that none of us is able to predict the rules with any
accuracy more than a year or two in advance in the best case - for many
zones, we can't even keep ahead of the changes.

If that algorithmic conversion is to happen, why not have it kick in much 
closer to now - say 20 years in advance, instead of 400 (and yes, even
earlier than where the old-format zone files ran out of table data)?   The 
conversions are all a combination of smoke, mirrors, and superstition out
there anyway.
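
The idea is only this much (a toy sketch, not real tzcode - the cutoff
value, the table contents, and the rule are all invented for illustration):

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical horizon: table lookups before this, the condensed
     * rule after.  2000000000 is roughly 20 years from now and still
     * fits a 32-bit time_t. */
    #define TABLE_CUTOFF ((time_t) 2000000000)

    /* The condensed POSIX-style rule, reduced here to a fixed offset
     * for brevity; a real version would evaluate the DST rule too. */
    static long rule_offset(time_t t)
    {
        (void) t;
        return 10 * 3600;        /* pretend the rule says UTC+10 */
    }

    /* Toy transition table: (transition time, UTC offset) pairs. */
    static const struct { time_t at; long off; } table[] = {
        { (time_t) 0,          10 * 3600 },
        { (time_t) 1000000000, 11 * 3600 },
    };

    static long table_offset(time_t t)
    {
        long off = table[0].off;
        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
            if (table[i].at <= t)
                off = table[i].off;
        return off;
    }

    static long utc_offset(time_t t)
    {
        return t < TABLE_CUTOFF ? table_offset(t) : rule_offset(t);
    }

    int main(void)
    {
        printf("offset, within table: %ld s\n",
               utc_offset((time_t) 1500000000));
        printf("offset, past cutoff:  %ld s\n",
               utc_offset((time_t) 2000000001));
        return 0;
    }

Nothing past the cutoff needs to be stored at all; the rule covers it.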

If the rules change (or rather, when the rules change) people need new
zone files anyway; if they don't change, there's no reason the algorithmic
conversion can't keep on being used, way out into the time when it is
being used for current & even past time conversions (it may be slower,
but anyone who cares can just update their zone tables).

kre



