[tz] McMurdo/South Pole
scolebourne at joda.org
Sun Sep 22 12:17:35 UTC 2013
Guy/Paul/Russ, I thought I'd explained pretty clearly why this recent
change is problematic. The replies all miss the point.
The tzdb is not some kind of theoretical project; it is used directly
as input data by millions of developers. Those developers have never
previously had to cross-reference tzdb data with any other data (i.e.
when somewhere was inhabited) to get a reasonable answer to the
question "what is local time in 1930?" (reasonable, not necessarily accurate).
Describing the input as malformed is unhelpful to the debate, because
a developer just using the data has no way to know from the tzdb that
the input *is* malformed.
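To make the consumer's perspective concrete, here is a minimal sketch (mine, not from the thread) of the kind of lookup a downstream developer does with java.time; the zone ID `Antarctica/McMurdo` is real, but the exact offset printed depends on the installed tzdata version:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class McMurdoQuery {
    public static void main(String[] args) {
        // A downstream developer's view: ask for local time in 1930.
        // Nothing in the API or in the tzdb data signals that the
        // location was uninhabited then -- the answer simply comes back
        // looking as authoritative as any other.
        ZoneId zone = ZoneId.of("Antarctica/McMurdo");
        ZonedDateTime local = Instant.parse("1930-06-01T00:00:00Z").atZone(zone);
        System.out.println(local);  // offset depends on the installed tzdata
    }
}
```

The point of the sketch: there is no error path here, so whatever offsets the tzdb ships become "the answer" for every such query.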
On 22 September 2013 08:43, Guy Harris <guy at alum.mit.edu> wrote:
> If you care about getting the *right* answer, the only concerns should be about cases where the current pre-1970 data is known to be correct and the changes will eliminate correct data. Discarding data not known to be correct could just replace one incorrect answer with another.
> If you only care about getting *an* answer, it obviously doesn't matter.
I don't just want *an* answer, and I'm not obsessed with the *right*
answer; my problem here is a *wrong* answer (any clock change,
including DST, before 1956 is wrong). Such wrong data implies human
activity where there was none. There are a variety of answers I would
accept before 1956, including LMT and UTC, but changes in offset I
cannot accept. My problem more generally is tinkering: changing one
guesswork answer into another guesswork answer.
Note that if you read the above carefully, there is a solution here.
It is absolutely fine to say that 1930 is bad input data for McMurdo,
but it is absolutely essential for the tzdb itself to provide that information.
What is completely objectionable is to say that we need some other
source of data to find that out. Why? Firstly, because the tzdb data
was previously complete within itself; now it is not. Secondly,
because no such data source exists that maps tzdb IDs to
habitation/accuracy dates (since 1800).
On 22 September 2013 07:40, Paul Eggert <eggert at cs.ucla.edu> wrote:
> More generally, the tz database isn't designed to answer questions
> about which parts of the Earth were inhabited when, and it's
> implausible that actual users would use it that way
I'm not asking for that!!! I'm simply asking the tzdb to provide
reasonable local time information, since the start of global offset
fixing, for the IDs it provides, and to keep the data stable where
entries are guesswork. From my perspective as a data consumer, that is
what the data has always provided.
I do wish there were a little more acceptance of how the data in tzdb
is actually used. I'm your customer, and I, and those downstream of
me, see only the data, not the
rationale/discussion/justification/theory made here. Focus solely on
the data visible downstream, and my problem should be obvious.
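As an illustration of "only the data is visible", here is a small sketch (my own, not from the thread) that dumps everything a java.time consumer can see for a zone: the compiled transition history carries no annotation distinguishing researched entries from guesswork:

```java
import java.time.ZoneId;
import java.time.zone.ZoneOffsetTransition;
import java.time.zone.ZoneRules;

public class McMurdoHistory {
    public static void main(String[] args) {
        // All a data consumer can inspect: the transition history
        // compiled from tzdb. There is no field saying "guesswork
        // before 1956" -- that rationale lives only on this list.
        ZoneRules rules = ZoneId.of("Antarctica/McMurdo").getRules();
        for (ZoneOffsetTransition t : rules.getTransitions()) {
            System.out.println(t.getInstant() + "  " + t.getOffsetBefore()
                    + " -> " + t.getOffsetAfter());
        }
    }
}
```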
On 22 September 2013 06:51, Russ Allbery <rra at stanford.edu> wrote:
> Stephen Colebourne <scolebourne at joda.org> writes:
>> We would not and should not create an ID for an uninhabited location,
>> but where somewhere is or was inhabited we should make best efforts to
>> define accurate data. The new McMurdo data is clearly not accurate prior
>> to 1956.
> There is no such thing as local time in McMurdo prior to 1956. There is
> no standard for accuracy; the entire concept of accuracy of such a thing
> is meaningless. Local time is not a physical property. It's something
> created by humans who make shared rules about how to set their clocks, and
> in the absence of human presence, it doesn't exist. Local time in McMurdo
> prior to its habitation is undefined.
> To use a Java analogy, you're doing the equivalent of complaining that
> finalize() isn't running at the point in your program where you expected
> it to and where it ran in a previous release of the JVM. You're getting
> about as much sympathy here as you'd get with that plea in a Java forum.
> As with any situation with undefined inputs, the output is basically at
> the discretion of the software, and returning either an error or some
> reasonably convenient answer are both standard approaches. Personally, I
> like the idea of returning an error, since I don't like undefined inputs
> resulting in apparently accurate outputs with no error. But,
> historically, the code has always returned some arbitrary but vaguely
> reasonable response (usually either a blind backwards-projection of
> current rules or whatever was the prevailing time standard in some
> reasonably nearby location) instead of producing an error, and there's a
> backwards compatibility challenge with changing that behavior to produce an error.
>> The key problem with the change for data consumers is the fact that
>> McMurdo was uninhabited in the 1930s is *external* information, that an
>> application would now need to *separately* know in order to get the
>> correct result for McMurdo.
> There's no such thing as a correct result for McMurdo in the 1930s because
> the question is not well-formed. The application cannot get something
> that doesn't exist.
>> The problem I have is that I'm no longer sure I can trust tzdb to safely
>> be the guardian of the limited pre-1970 data which it has always
>> possessed and which Java has long used. I will be talking to Oracle
>> people this week to discuss what options we have for Java probably
>> requiring manual workarounds of the damaged data. <shakes head
>> in despair>
> I once again encourage you to start your own separate project. I think
> that would make quite a few people much happier, including you.
> Russ Allbery (rra at stanford.edu) <http://www.eyrie.org/~eagle/>