1. 31
    1. 10

      A related useful fact I’ve learned recently:

      Conversion from a Unix timestamp (a number) to date/time in UTC (year-month-day h:m:s) is a pure function; it needn’t account for leap seconds.

      As a corollary, software can use human-readable times in config without depending on OS timezone information.
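      A minimal sketch of that pure function in Python, using only integer arithmetic (no timezone database, no leap-second table). The date part follows the well-known civil-from-days algorithm; the function name is mine:

```python
def unix_to_utc(ts: int):
    """Convert a Unix timestamp to (year, month, day, hour, minute, second) in UTC."""
    days, secs = divmod(ts, 86400)          # every Unix day is exactly 86400 s
    h, rem = divmod(secs, 3600)
    m, s = divmod(rem, 60)
    # Shift the epoch from 1970-01-01 to 0000-03-01 so leap days fall at year end.
    z = days + 719468
    era = (z if z >= 0 else z - 146096) // 146097
    doe = z - era * 146097                  # day of era [0, 146096]
    yoe = (doe - doe // 1460 + doe // 36524 - doe // 146096) // 365
    y = yoe + era * 400
    doy = doe - (365 * yoe + yoe // 4 - yoe // 100)
    mp = (5 * doy + 2) // 153               # March-based month [0, 11]
    d = doy - (153 * mp + 2) // 5 + 1
    mo = mp + 3 if mp < 10 else mp - 9
    y += 1 if mo <= 2 else 0
    return (y, mo, d, h, m, s)
```

      Note that nothing in the function consults external data: the same input always yields the same output, on any machine.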


      1. 11

        Which, incidentally, means that all the documentation that describes UNIX time as “the number of seconds since 01/01/1970 UTC” is wrong. Wikipedia, for example, says that “It measures time by the number of seconds that have elapsed since 00:00:00 UTC on 1 January 1970”, which is incorrect. (Though the POSIX spec it links to seems vague enough to at least not be incorrect; it says the “Seconds since epoch” is “A value that approximates the number of seconds that have elapsed since the Epoch”.)

        I spent a long time trying to figure out the best way to correctly convert between an ISO-8601 timestamp and a UNIX timestamp based on the assumption that UNIX time counted actual seconds since 01/01/1970 UTC, before I found through experimentation that everything I had read and thought I knew about UNIX time was wrong.
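        The mismatch is easy to demonstrate with Python’s datetime module (which, like POSIX arithmetic, ignores leap seconds): the Unix timestamp for 2017-01-01 is exactly 86400 times the number of calendar days since the epoch, even though 27 leap seconds were inserted between 1972 and the end of 2016.

```python
from datetime import datetime, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
t = datetime(2017, 1, 1, tzinfo=timezone.utc)

days = (t - epoch).days            # calendar days between the two dates
unix_ts = int(t.timestamp())       # POSIX "Seconds Since the Epoch"

# Exactly 86400 per day: the 27 inserted leap seconds are simply not counted,
# so this is NOT the number of elapsed SI seconds since 1970-01-01 UTC.
assert unix_ts == days * 86400
```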

        1. 11

          I would fix that Wikipedia article, but you (or the others in the discussion) seem to be better prepared to come up with correct wording, so I most humbly encourage someone to take a whack at it, in the spirit of encouraging people to get involved. Don’t worry, you won’t get reverted. (Probably. I’ll take a look if that happens.)

        2. 8

          Quoting from that article:

          In Unix time, every day contains exactly 86400 seconds but leap seconds are accounted for. Each leap second uses the timestamp of a second that immediately precedes or follows it.

          Well, that’s certainly one way to handle them…

        3. 1

          Yeah, exactly the same story here.

      2. 3

        My favourite versions of these functions are on my blog: broken-down date to day number and day number to broken-down date. Including the time as well as the date (in POSIX or NTP style) is comparatively trivial :-)
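        For reference, the broken-down-date-to-day-number direction can be sketched like this, using the well-known days-from-civil algorithm (the linked posts derive equivalent functions):

```python
def days_from_civil(y: int, m: int, d: int) -> int:
    """Days since the Unix epoch (1970-01-01) for a proleptic Gregorian date."""
    y -= m <= 2                             # treat Jan/Feb as months 13/14 of the prior year
    era = (y if y >= 0 else y - 399) // 400
    yoe = y - era * 400                     # year of era [0, 399]
    mp = (m + 9) % 12                       # March-based month: March = 0 ... February = 11
    doy = (153 * mp + 2) // 5 + d - 1       # day of year, March-based
    doe = yoe * 365 + yoe // 4 - yoe // 100 + doy
    return era * 146097 + doe - 719468     # shift epoch back to 1970-01-01
```

        Multiplying the result by 86400 and adding the seconds within the day gives a POSIX-style timestamp directly.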

    2. 7

      When programmers discuss the problems caused by leap seconds, they usually agree that the solution is to “just” keep time in TAI, and apply leap second adjustments at the outer layers of the system, in a similar manner to time zones.

      Abolishing leap seconds will be helpful for users with very tight accuracy requirements. UTC is the only timescale provided by national laboratories in a metrologically traceable manner, i.e. in a way that provides real-time access and allows users to demonstrate exactly how accurate their timekeeping is.

      TAI is not directly traceable in the same way: it is only a “paper” clock, published monthly in arrears in Circular T as a table of corrections to the various national timescales. (See the explanatory supplement for more details).

      The effect of this is that users who require high-accuracy uniformity and traceability have to implement leap seconds to recover a uniform timescale from UTC, not the other way round as the programmers’ falsehood would have it.

      “Disseminate TAI (along with the current leap offset) and implement the leap seconds at the point of use” might be a “programmers’ falsehood” but it’s also what 3 out of 4 GNSS systems actually do, so it has something going for it.

      The fact that UTC (and not TAI-like timescales) is blessed for dissemination and traceability is downstream of ITU’s request that only UTC should be broadcast; not because of any technical difficulty with disseminating/processing traceable TAI-like timescales:

      • GNSS system times are not representations of UTC, and being broadcast they are not fulfilling requests of ITU, which is recommending only UTC to be broadcast
      • GNSS system times shall be considered as internal technical parameters used only for internal system synchronization, but this is not the case.

      The time that comes out of an atomic clock looks like TAI; adding the leap seconds is something that has to come afterwards, and a system designer gets to choose when. Leap seconds don’t factor into calculations involving the precise timing of moving objects, whether the objects are airplanes, the angles/phases of generators in a power grid, particles in an accelerator, RF signals, etc. Unless you’re OK with grievously anomalous results whenever there’s a leap second, you want a timescale that looks like TAI, not one that looks like UTC. Why bake the leap seconds into your timescale early on if you’re going to need to unbake them every time you calculate with times?

      The designers of GPS, BeiDou, and Galileo all wisely flout this ITU recommendation: their constellations broadcast a TAI-like timescale (alongside the current leap second offset). The designers of PTP also flout this recommendation – by default, the timescale for PTP packets is TAI (usually derived from a GNSS receiver), not UTC. Should you want to reconstitute a UTC time, there is a currentUtcOffset field in PTP timestamps, whose value you can add to a TAI time.
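      A sketch of that scheme, with illustrative function names of my own (the constants are the published TAI−UTC and TAI−GPS offsets; real systems take the leap-second count from the broadcast data, e.g. PTP’s currentUtcOffset field, rather than hard-coding it):

```python
CURRENT_UTC_OFFSET = 37   # TAI - UTC in seconds, as of 2017 (broadcast live in practice)

def utc_from_tai(tai_seconds: int) -> int:
    """Reconstruct UTC at the point of use: subtract the broadcast offset from TAI."""
    return tai_seconds - CURRENT_UTC_OFFSET

def utc_from_gps(gps_seconds: int) -> int:
    """GPS time was aligned with UTC in 1980, when TAI - UTC was 19 s,
    so GPS = TAI - 19 and the GPS-to-UTC offset is currently 37 - 19 = 18 s."""
    return gps_seconds - (CURRENT_UTC_OFFSET - 19)
```

      The leap-second bookkeeping stays at the edge: everything upstream computes with the uniform timescale.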

      This “disseminate UTC only” ITU recommendation has been at fundamental odds with real-world PNT systems ever since GPS was known as “NAVSTAR GPS”.

      1. 1

        Except GLONASS does disseminate UTC, including leap seconds.

        If you want your time signal to be broadly useful for celestial navigation, UTC is the way to go, as it’s (for now) guaranteed to be within 1 s of UT1. I believe that’s where the ITU’s recommendation comes from. That said, it’s probably time for this use case to take a back seat to the broader problems caused by leap seconds.

      2. 1

        You aren’t really arguing against what I said, because my “falsehood” did not talk about disseminating TAI along with the UTC offset (that isn’t “just” TAI). Those paragraphs were really an introduction to the next section where I explain that systems can’t avoid working with UTC. And GPS and PTP do not avoid working with UTC: as you said, they tackle its awkwardness head-on.

        The way GPS handles UTC is even more complicated, though. Take a look at the GPS interface specification, in particular IS-GPS-200 section (sic!) where it specifies that UTC is not “just” GPS time plus the leap second offset: there are also parameters A_0 and A_1 that describe a more fine-grained rate and phase adjustment. Section 3.3.4 says more about how GPS time relates to UTC:

        The OCS shall control the GPS time scale to be within one microsecond of UTC (modulo one second). The LNAV/CNAV data contains the requisite data for relating GPS time to UTC. The accuracy of this data during the transmission interval shall be such that it relates GPS time (maintained by the MCS of the CS) to UTC (USNO) within 20 nanoseconds (one sigma).

        This is (basically) related to the fact that atomic clocks need to be adjusted to tick at the same rate as UTC owing to various effects, including special and general relativity: NIST and the GPS operations centre are a mile high in Colorado, so their clocks tick at a different rate from the USNO’s in Washington DC, and at a different rate again from the clocks in the satellites whizzing around in orbit.
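        As a rough sketch of that finer-grained relation (parameter names follow the Δt_UTC polynomial in IS-GPS-200; treat the details as an approximation of the spec, not a substitute for it):

```python
SECONDS_PER_WEEK = 604800

def delta_t_utc(t_e, wn, dt_ls, a0, a1, t_ot, wn_t):
    """Offset (GPS time - UTC) in seconds: the integer leap-second count
    dt_ls, corrected by a broadcast phase term a0 (seconds) and rate term
    a1 (s/s), evaluated relative to the reference time (t_ot, wn_t).

    t_e, wn : GPS time of interest (seconds into week, week number)
    """
    return dt_ls + a0 + a1 * (t_e - t_ot + SECONDS_PER_WEEK * (wn - wn_t))
```

        With a0 = a1 = 0 this degenerates to the plain leap-second subtraction; the extra terms are what let the broadcast relate GPS time to UTC(USNO) at the tens-of-nanoseconds level.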

        And you are right that UTC is a spectacularly poor design for a reference timescale, hence the effort to get rid of leap seconds.

    3. 1

      Some future time travel researcher is having a hard time predicting where to point their past-camera to.

    4. 1

      I learned that timezones and UTC are not a solved problem, and that international committees are hoping to converge on a working solution via trial and error in the long term.