Simon - I am perplexed, and indeed somewhat confused, by your response.

One, because we can. We can do it by masking a single character. Two, since most of the world uses a decimal number system, the decade, like the century (though the century is less well defined), has become a popularly used reference point. If nine had been the chosen radix for our number system rather than ten (which probably would have been a better choice), then we would be distinguishing nine-year and 81-year intervals instead.

If you have no requirement to represent a decade, that's fine; neither do I. But if Ed does - and he is a stakeholder in this process - we should find a way to support this requirement.

I am trying to understand your position on the meaningfulness (or lack thereof) of a date, and take it that you think the representation of days, years, etc. is not meaningful (and thus not worth standardizing) unless it is "aligned to common chronological divisions". I confess that I may not fully understand what you mean, but I take it to mean that a year - say, 1984 - is not sufficiently defined without a precise start and end point, including a time zone and a calendar designator. Please forgive me if I am misrepresenting your position.

What we are trying to do here is build a specification that meets needs ranging from simple to complex, with solutions whose complexity is proportional to the complexity of the need. That goes without saying - it's what standards are supposed to do - but my point is that this work started off as an effort on behalf of data entry people who want to do things like enter a year of publication into a form, a year like 1986, and "1986" has quite enough specificity for their needs. Those users represent a large part of the constituency of this effort. We also want to satisfy more complex scientific requirements where possible, but not by sacrificing simple solutions to simple requirements.

--Ray

Sent: Tuesday, December 21, 2010 4:30 PM
Subject: Re: [DATETIME] precision and decade

This discussion seems rather shaky to me, because it builds too much on weak foundations.

I maintain that, at the level at which it is of interest to us, there is no coherent concept of a particular day/week/year/decade/century as an entity in its own right, and that there is therefore no point (and that's on the generous side) in elaborating this.

What does make common sense (and is therefore worth elaborating if possible) is

(a) that an event happened within a given time interval, and for ease of reference (only) the boundaries of that time interval may be aligned to common chronological divisions

(b) that there is a set of events (constrained in some other way, as it is impossible to catalogue all events in any time interval) that happened within a certain time interval -- as in (a), common boundaries are often used for convenience. This may be what people have in mind when they refer to a day, week, month, year, decade, century or whatever; I maintain it is the only sense in which those time concepts can be meaningfully reified. Note, however, that those with different calendars will be grouping together different sets of things when they say, for instance, "that was a hard month".

There is nothing universal about any particular time interval. Why prioritise a decade in a decimal calendar over some other interval of years in some other calendar? The closer I consider things, the more arbitrary a decade or a century appears. Hence my preference (and willingness to argue) for a primary, general system of points and intervals, with notations for selecting ranges of years and the like kept secondary and less important (and adopted only if trouble-free).

The precisions mentioned here are fully dependent on the time system and calendar, and therefore only "scientific" inasmuch as our current science uses these units and this numbering system. I do *not* believe they are worthy of standardization.

Simon

Rereading the recent (and not-so-recent) discussion, I'm trying to find a way to move this along, particularly the issue of precision. Ed seems to be the one most interested, and he said:

> At the heart of things I also don't want us to confuse dates with
> intervals. If I say something occurred in the 1960s I don't want to
> have to use intervals, just as I don't have to use intervals to talk
> about 12 Sept 1933 (which is again saying something different from
> 1933-09-12T00:00Z/1933-09-12T23:59Z).
>
> I suggest we generally have the following precisions:
> - second
> - minute
> - hour
> - day
> - week
> - month
> - year
> - decade
> - century

We have the precisions Ed seeks for second, minute, hour, day, week, month, and year.
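To make Ed's distinction concrete, here is a minimal sketch (in Python; the function name and parse rules are my own illustration, nothing here is spec text) of how a truncated date string already carries its precision, as opposed to spelling the same day out as an interval:

    # Illustrative only: infer precision from the shape of a truncated
    # ISO 8601-style string. Week notation is omitted for brevity.
    def precision_of(value: str) -> str:
        if "T" in value:
            time_part = value.split("T", 1)[1].rstrip("Z")
            return {1: "hour", 2: "minute", 3: "second"}[len(time_part.split(":"))]
        return {1: "year", 2: "month", 3: "day"}[len(value.split("-"))]

    # A plain date at day precision ...
    assert precision_of("1933-09-12") == "day"
    # ... is a different statement from the explicit interval
    # 1933-09-12T00:00Z/1933-09-12T23:59Z.
    assert precision_of("1933") == "year"
    assert precision_of("1933-09") == "month"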

Which leaves decade and century. Century is a separate discussion unto itself; I will treat it in a separate thread.

Ed supports the 'x' approach, where we let 196x mean the decade, the 1960s.
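For concreteness, a minimal sketch of what that masking buys us (the function name and the inclusive year-pair convention are mine, purely illustrative; the proposal only concerns the notation itself):

    # Illustrative only: a single masked digit widens a year to a
    # ten-year span, e.g. '196x' -> the years 1960 through 1969.
    def unmask_decade(value: str) -> tuple[int, int]:
        if len(value) != 4 or not value.endswith("x"):
            raise ValueError("expected a masked year such as '196x'")
        start = int(value[:-1]) * 10
        return start, start + 9

    assert unmask_decade("196x") == (1960, 1969)  # the 1960s
    # Two masked digits ('19xx') would cover century the same way,
    # which is the separate discussion mentioned above.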

There are reasonable arguments against this, but I'm willing to go along with it if it will move us forward. We're only talking about this for decades (and possibly centuries).