aka “It’s time to make programming easier by changing reality”
I feel like, in this case, we could also make programming easier by changing programming. The root cause of this isn’t leap seconds per se, but the fact that the de-facto-standard computer timekeeping system doesn’t understand them, and we hacked it up in such a way that they completely break everything.
If UNIX time counted actual, rather than idealized, seconds, most things would become easier. (That is, for each tick of a naïve clock, the current second gets labelled with the numerically next integer, leap second or not.) Converting such a count to UTC without current leap second data would be wrong, but clocks don’t need to care about this; only frontend systems do, and in 2022 those already run plenty of software that needs updating more often than every six months.
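(For illustration, a minimal sketch of that frontend conversion, not taken from the article or the comment above: if the timestamp counts every elapsed second, turning it back into UTC needs a leap-second table. The two table entries below are real leap-second dates, but a real system would load the full, current table from IERS/tzdata rather than hard-code it.)

```go
package main

import (
	"fmt"
	"time"
)

// Hypothetical leap-second table: the Unix timestamp at which each insertion
// took effect. These two entries are real (2015-07-01 and 2017-01-01), but a
// real frontend would load the full, current table instead of hard-coding it.
var leapSecondsAt = []int64{1435708800, 1483228800}

// actualToUTC converts a count of actual elapsed seconds since the epoch
// (where every leap second got its own integer) back into a UTC wall-clock
// time by subtracting the leap seconds inserted so far.
func actualToUTC(actual int64) time.Time {
	var offset int64
	for _, ls := range leapSecondsAt {
		if actual-offset >= ls {
			offset++
		}
	}
	return time.Unix(actual-offset, 0).UTC()
}

func main() {
	fmt.Println(actualToUTC(1483228802)) // 2017-01-01 00:00:00 +0000 UTC
}
```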
The article doesn’t bring it up, but the problems with leap seconds go beyond what better programming can fix. Since leap seconds are only announced about 6 months in advance, the number of seconds between now and a UTC timestamp more than 6 months in the future is unknown. Therefore, if Unix time counted actual seconds as you suggest, it would be impossible to convert UTC to Unix time for such future timestamps. That would mean that calendars and other applications that need to represent future timestamps couldn’t use Unix time.
As I see it, the root cause of this problem is that civil time, when used to express plans more than 6 months into the future, is undefined. Better programming can’t fix that.
The date and time of an event in several years can’t be defined in terms of seconds from now, but you can easily define it in terms of date, time and timezone.
You cannot easily define the time and date of an event in the future in terms of date, time, and timezone! This has nothing to do with UNIX timestamps being stored in seconds.
If you care about elapsed time, you cannot compute the amount of actual time that will pass from now until some date even a year from now, not with precision at the level of seconds or finer. X amount of time into the future doesn’t map to a fixed date, time, and timezone, because we keep redefining time with these leap seconds.
FB is right, kill the leap second.
This goes beyond leap seconds. Even with a fixed date, time, and timezone, the timezone’s definition can change, and does with some regularity.
Unless we kill political control of timezones, this will still need to be taken into consideration.
To some extent that’s true, but not generally.
The definition of UTC-5, modulo leap seconds, doesn’t change. In that sense removing leap seconds does allow you to compute future times just fine. If I have a device that I need to check 5 years from now, I know exactly what time that will be UTC-5, modulo leap seconds.
Now if you mean timezone in the sense of EST/EDT, then plenty of time zones have not changed in well over a century and it’s hard to see them ever changing. Perhaps ET will change by being fixed to EST or EDT, but generally, as countries become more developed they stop making these changes because of the disruption to the economy. Check out https://data.iana.org/time-zones/tzdb/NEWS
So yes, political control of timezones is actually being killed as the economic consequences of changing them become severe. Things are slowly freezing into place, aside from leap seconds.
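(A minimal sketch of the fixed-offset point above, mine rather than the commenter’s; the date is made up. A fixed offset like UTC-5 carries no DST or tzdb rules, so the instant is fully determined today, modulo leap seconds.)

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// A fixed-offset zone has no DST transitions and no political rules
	// attached to it, so this future instant is unambiguous today,
	// modulo leap seconds.
	utcMinus5 := time.FixedZone("UTC-5", -5*60*60)
	checkAt := time.Date(2027, time.August, 1, 9, 0, 0, 0, utcMinus5)
	fmt.Println(checkAt.UTC()) // 2027-08-01 14:00:00 +0000 UTC
}
```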
You may have missed this news, then:
https://www.reuters.com/world/us/us-senate-approves-bill-that-would-make-daylight-savings-time-permanent-2023-2022-03-15/
Here’s a list of other political changes:
https://www.timeanddate.com/news/time/
Basically, “18:30 on 2038-01-19 in the host system timezone” is the only more or less well-defined concept of a future date that is useful in practice. When that time comes, a system that is up to date with all natural and political changes can correctly detect that it came.
Applications that deal with arbitrary time intervals in the future like “2.34 * 10^9 seconds from now” should use a monotonic scale like TAI anyway; they don’t need UTC.
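(A sketch of what “correctly detect that it came” can look like, mine rather than the commenter’s: store the civil time and only resolve it against the host’s timezone data when the check actually runs.)

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// The moment is stored as civil time in the host system timezone; the
	// concrete instant is only resolved here, using whatever zone rules the
	// host has when this code runs.
	target := time.Date(2038, time.January, 19, 18, 30, 0, 0, time.Local)
	if time.Now().Before(target) {
		fmt.Println("not yet:", time.Until(target), "to go")
	} else {
		fmt.Println("18:30 on 2038-01-19 local time has arrived")
	}
}
```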
Eh, scheduling meetings a year or two in advance can happen, and it could be well defined and useful. But it’s important to note that the further into the future something is happening, the less the accuracy matters, unless it’s astronomy, at which point you have TAI or UT1 depending on context.
Except that there is no safe way to compute TAI on most systems.
A GPS receiver costs $1000 at most. If you need precise accuracy, it’s what you’re going to use, and it’s just GPS_time + 19s to get to TAI (GPS time runs a fixed 19 seconds behind TAI, with no leap seconds). Big companies run their own NTP pools for reliability, and if you have your own pool, you can run it at TAI.
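(For concreteness, a tiny sketch, not part of the comment above: the GPS-to-TAI offset is a fixed constant, while getting from either scale to UTC is exactly where the leap-second table comes in.)

```go
package main

import "fmt"

const (
	gpsToTAI = 19 // seconds; TAI has been exactly 19s ahead of GPS time since the 1980 GPS epoch
	taiToUTC = 37 // seconds; valid as of 2022, but changes with every leap second
)

func main() {
	gpsSeconds := int64(1_300_000_000) // some timestamp on the GPS scale
	tai := gpsSeconds + gpsToTAI       // needs no tables, ever
	utc := tai - taiToUTC              // only correct while the 37s offset holds
	fmt.Println(tai, utc)
}
```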
I’ve seen that! It’s what I meant about ET changing its definition. It’s far from done sadly :( The House seems to have abandoned the bill: https://thehill.com/homenews/house/3571007-permanent-daylight-saving-time-hits-brick-wall-in-house/
In any case, the problem is with redefining time zones, not dropping them.
Can you elaborate on this? I’m really curious why it is so. I was under the impression that if we say a meeting will happen on August 1st, 2050, at 3.30pm CEST, in Bern, Switzerland, not many things can make this ambiguous. If Switzerland stops using CEST, I’ll probably just switch to the replacement timezone. The reason I’m confused is that I don’t see how leap seconds play any role.
It is ambiguous because extra seconds of time may be inserted between now and then, so no one can tell you how long from now that time is (in seconds).
In what situations do you need to know the exact number of seconds to a future (civil) time more than a year in the future?
As I see it, the root cause of this problem is that civil time, when used to express plans more than 6 months into the future, is undefined.
Civil time is not “undefined”. Definitions of local civil time for various locations may change, but that’s not the same thing at all as “undefined”.
I also don’t generally agree with “better programming can’t fix” – the issue simply is programmers demanding that a messy human construct stop being messy and instead become perfectly neat and regular, since we can’t possibly cope with the complexity otherwise. You slip into this yourself: you assume that the only useful, perhaps the only possible, representation of a future date/time is an integer number of seconds that can be added to the present Unix timestamp. The tyranny of the Unix timestamp is the problem here, and trying to re-orient all human behavior to make Unix timestamps easier to use for this purpose is always going to be a losing proposition.
As I see it, the root cause of this problem is that civil time, when used to express plans more than 6 months into the future, is undefined. Better programming can’t fix that.
This is true to an extent, but I think it’s true independently of leap seconds. The timezone, and even the calendar, that will apply to dates in the future are also undefined.
I also think it’s not the whole story. It seems intuitively reasonable to me that “the moment an exact amount of real time from this other moment” is a different type from “the moment a past/future clock read/reads this time”, and that knowledge from the past or future is required to convert between the two. I think we’ve been taking a huge shortcut by using one representation for these two things, and that we’d probably be better off, regardless of the leap second debate, being clear which one we mean in any given instance.
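(A rough sketch of the two types being described; the names and fields are mine, purely for illustration.)

```go
package timesketch

// Instant is "an exact amount of real time from a known moment": safe to do
// arithmetic on, and needs no knowledge of the future.
type Instant struct {
	TAISeconds int64
}

// FutureCivil is "the moment a future clock reads this time": civil fields
// plus a zone name, resolved to an Instant only once the timezone (and leap
// second) data covering that date is actually known.
type FutureCivil struct {
	Year, Month, Day int
	Hour, Minute     int
	Zone             string // e.g. "Europe/Zurich"
}
```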
Submitted for discussion. I am not in favor of this proposal.
The proposed solution is to just push off the complexity of accurate solar timekeeping until the next millennium, which could create a Y3K bug for anyone tracking accurate solar time who didn’t adopt the proposal. This is a civil and social problem that could be addressed with standards, treaties, and diplomacy - then accurately modeled in software once the human element is figured out. I’m no expert in timekeeping, but Facebook’s history has made it clear that you can’t solve civil and social problems with code alone.
It seems likely to me that human civilization as we know it will be unrecognizable in less than 100 years. Take your pick: climate change, resource wars, genetically engineered plagues, robot uprising, uploading, or (in the best case) a smooth transition to a post-scarcity society.
I vote for not worrying about solar timekeeping, and if it can be put off to Y3K, that’s fine.
We already have a time base without leap seconds; it’s called UT1. Use that if you want that. What Facebook is really saying is that they want the civil clock to track UT1 instead of UTC. But this essay doesn’t address any of the reasons why there are leap seconds, or what consequences there would be from letting solar noon drift away from clock noon.
UT1 still wanders around as the Earth’s rotation slows down or speeds up. TAI is what they want, and I think that’s what we should be using too. But Unix time is an abomination that got ubiquitous, and now people have to live with it. Honestly, I think Facebook could gain a lot by transitioning to TAI internally and converting to UTC at the outer boundary when needed.
Here’s an old comment of mine from a different post:
Time is just fucked on so many different levels:
Philosophical: What is time? We just don’t know.
Physical: Turns out there is no such thing as simultaneity, and time flows differently at different locations. Time may be discrete at the Planck level, but we don’t really know yet.
Cosmological: The Earth does not rotate at a constant speed, the Earth does not orbit the Sun in a whole number of its rotations, and the Moon does not orbit in an even ratio either.
Historical: Humans have not used time or calendars consistently.
Notational: Some time notations are ambiguous (e.g. during daylight savings transitions) and others are skipped.
Regional: Different regions use subtly different clocks and calendars.
Political: Different political actors choose to change time whenever they feel like it with little or no warning.
The pseudocode (which looks like Go) in the article is explicitly not what the Go time package does. In fact, the section on monotonic clocks has the exact same example:
For example, this code always computes a positive elapsed time of approximately 20 milliseconds, even if the wall clock is changed during the operation being timed:
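(The snippet that quote introduces is elided here; a minimal runnable version of it looks something like the following, with time.Sleep standing in for the 20 ms operation.)

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	start := time.Now()               // carries wall-clock and monotonic readings
	time.Sleep(20 * time.Millisecond) // the operation being timed
	t := time.Now()
	elapsed := t.Sub(start) // Sub uses the monotonic readings, so this stays
	fmt.Println(elapsed)    // ~20ms even if the wall clock is stepped meanwhile
}
```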
So this caused an outage at Reddit but I’ve never heard of outages caused by leap seconds in high-frequency trading. How do they deal with this? I had a friend once tell me that FPGA and advances in HFT were literally pushing up against some physical boundaries of how fast light can pass through materials (I don’t quite understand enough about material science to know how true this is). So I would think if anything needed accurate timing it would be HFT platforms.
Trading usually stops before midnight. Moreover, trading is often stopped when there is a decent chance that something might misbehave; for example, trading was effectively stopped for a stock when its price reached close to the integer limit.
Today I learned if you are “smearing on public services, [you] can’t join the public pools.”
More seriously, there is a pretty big leap right here in the last sentence: “which we believe will be enough for the next millennium.” I haven’t done the math, but I have to believe that ice is going to continue to melt pretty fast for the next 100 years and that will slow us down. Even forgetting global warming, 1000 years is a ridiculously long time to be extrapolating.
As long as everyone agrees on a fixed standard, the exact to-within-a-second time of sunrise, sunset, midday or midnight in terms of where celestial bodies are really only matters to astronomers, satellite operators and… I dunno, maybe some birds or something. The difference in sunset/sunrise time from one side of a timezone to another is already way bigger than that, so the aforementioned astronomers and satellite operators already have correct real time based on their exact longitude anyway. Just use an atomic clock reference time and let display software show offsets from it – if the offsets ever get big enough for most systems to actually care.