Having worked in embedded, I'm perfectly comfortable with horrible kludges that nonetheless solve the problem:
In 2036, factory reset to wipe all state, and keep using the same ancient kernel with a tiny hotpatch to assume 1/1/2035 as time base. If your kitchen appliance is still around by the time that is a problem, you'll get an award ;)
But don't tell anybody, I plan to hawk my consulting skills from 2030-2038 to pad my retirement income. Just like the good old Y2K days & COBOL.
How does the hot patch solve the logistical challenge of installing firmware updates on thousands of devices that generally receive no updates at all?
Am I supposed to pull my microwave out of the wall and ship it back to GE? Does it have a USB port for installing this hot patch? How does GE tell me about it?
Do you care about your microwave showing the correct date? Does your microwave even keep track of the date?
Devices like your microwave won't magically stop working in 2038. For most non-networked and, in 2038, ancient appliances, there shouldn't be that much of a problem.
I think most people replying haven't shopped for appliances in the past 5 years. For example, you have to choose the most basic HVAC system these days to avoid communicating systems with built-in smart controls. A fair number of fridges now come with integrated screens. A lot of new homes come with Ring doorbells and smart security systems. Even washing machines and dryers now have built-in systems for notifying you when your clothes are done and other such things.
Pedants would say "well, just don't buy this IoT crap," but the problem is that, in a lot of cases, only the bottom-of-the-line devices come without it.
This is a very upper-class problem. You are vastly overestimating the ratio of smart to dumb devices.
What's more, most smart devices will not last until 2038. Planned obsolescence is a standard feature of smart-device product development today, especially since fewer and fewer of them do anything without cloud integration. If this trend continues, I would bet a kidney that no smart product launched in 2025, regardless of target audience, will still be operating in 2035 while being incapable of receiving updates automatically.
Protocols would change way before then. 2038 is 19 years in the future. What devices from 2000 are still running that weren't already running long before 2000?
Dibs on your kidney.
We've bought one of the cheapest dishwashers we could find, and it came with a feature to connect it to an app using NFC. Granted, the whole EU is "very upper class" compared to most of the world, but it's still quite a huge number of machines.
I believe it. We've seen it happen with TVs for sure.
I think we're all both underestimating and overestimating the problem :)
There are way more "smart" devices (devices with some kind of embedded software and clocks) than we assume, and there are way less of these devices than we fear.
It's still gonna be a problem though.
Shouldn't, but I have to deal with things that "shouldn't" happen all the time.
Remember, the Zune wouldn’t boot in 2009 because 2008 was a leap year, and that wasn’t tested.
Yeah, but I specifically said non-networked appliances. Most microwaves I know don't even let you input the date. For those that do, and that you still want to keep in 2038, you can just enter a fake, old date.
I understand that there might be some smart microwaves with Wifi and whatever (setting all sentiments about that aside), but those are not what I meant.
I can set the date on my microwave. I just set it to 2040 and it seems to be working fine, so I guess if anyone's really worried about this buy a Kenmore Elite while you still can.
Trying to imagine why you would want the date to be on a microwave
Maybe so it can display the correct time when daylight savings rolls around?
(I'm just trying to guess. Personally I think it's an absurd feature, and on the rare occasion I've had a microwave that even displayed the time, it was always wrong anyway)
First thing that comes to mind is maybe more complicated than anything you can do, but something like running at a certain time 5 days a week to heat up a meal for your kids to eat after school while you're at work. You'd prep said meal and let it sit in there, heated right before it would've been eaten. Although there are a couple issues I can already think of. The microwave is not a fridge. This limits food that could sit there all day. Also, the kids could hit 'start' when they get in the door, so main appeal would be something cooking plus cooling enough to eat by the time they get home, or something with a long cooktime, which would maybe be mashed potatoes or something, and we're still just talking around 5 minutes most likely.
I don't think the Zune was really a networked device. It just literally wouldn't turn on for a day.
Having a date on your microwave could be useful for things like updating the clock for daylight savings time. Or it may be that the clock overflow causes the entire firmware to crash in 2038, bricking an otherwise still functioning device.
> updating the clock for daylight savings time
That feature is unlikely to work well; tzdata is updated multiple times a year.
How expensive would it be to put a WWV shortwave receiver into a device?
"the station's future is in doubt, because it, along with WWVB and WWVH, has been recommended for defunding and elimination in the NIST's Fiscal Year 2019 budget request."
Zune wouldn't boot because the real time clock keeps time in MM:DD:YY HH:MM:SS which is utterly wrong.
And yet, they're all that way! Hardware RTCs are horrible.
(Well, almost all of them are that way. There are a few, precious few, sane parts that are just a 32-bit counter counting up at one second per second. These, of course, are rare, obscenely expensive, large, and missing useful features like trimming. Sigh.)
The NXP Kinetis microcontrollers have a sane RTC that counts seconds as a 32-bit number plus fractional 32 kHz ticks. Bonus: it survives a hardware reset.
STMicro's RTCs are the usual POS everyone loathes. All the standalone RTC ICs are just like that too.
I’m willing to accept “the device will function without update” and “the device will be outside its useful life in 2038” as reasons to not be concerned. I am not willing to accept “install a hot patch” as a solution unless it comes with an explanation of how the hot patch will be distributed.
Let's get real. Most such things, if not all, will fail and have to be replaced before then. Modern engineering ensures a death date agreeable to the finance department.
> Most of such if not all things will fail and have to be replaced before then.
I'm looking back at the things that I own today that I owned in 2000. (Or, if you're young, consider the things your parents own that they owned in 2000). It's not a ton, but it's not zero either. And we're equidistant to 2038 and to 2000.
I think you (and most of the other replies below) miss the point. The newer the device, the shorter the lifespan. Your year-2000 device was built to last much longer than the average equivalent device built today. Planned obsolescence is a staple of consumer society: you have to have a guaranteed reason to replace the item, and breaking down will do that better than anything else.
20 or 40 years ago most people expected their devices to last a lot longer than today. Now if your TV breaks down after 5 years, most never bother fixing it and just think "I was due for a new one anyway".
Exactly. My 2013(?) microwave started failing in 2018 - I keep its power strip off when it's not in use and frequently have to power-cycle it. It replaced a microwave from possibly the mid '90s around 2013. Reliability engineering seems to have matured for planned life cycles.
"Most of such if not all things will fail and have to be replaced before then."
That's what they always say but when the time comes the old stuff is still around. Happened with Y2K stuff, happened with old Cobol code.
We have an old VHS camcorder my father bought in the mid '80s. It still works, but the timestamp overlay can't be set to display a year beyond 1999.
Most embedded RTCs are still limited to storing two digit years. There will be a minor Y2.1K issue because of this.
I'm still using a microwave I bought in the late 1980s.
Last time I read a blog post on 2038 one of the main concerns were cars.
I'd argue that for environmental reasons such cars should be taken off the roads much sooner, including through government buy-backs.
Does your microwave actually know what the date is? All that I've seen just show the time.
Why not ship devices with a patch now, that activates any (every?) time the system clock reaches 2036?
I gotta say that my middle sister made a nice living polishing COBOL turds for something like a decade. We all thought it would be a sort of last hurrah but it turned out to be a lesson in not underestimating how much folks want to do things in the most painful way possible.
Anyway, I'm sorta skeptical that a whole lot of things that have been running deeply embedded code for a while now will still be running in 2038. Tech changes faster now, hardware designed to last many decades is the exception rather than the rule, and so much of industry is going to have to change to overcome future challenges that I expect a great many things to be taken out of service before then.
Thanks, but I'm not looking forward to the increased mental health bill that this will bring (COBOL polishing I mean)
Kitchen appliances are the easy and almost ignorable bit. What about:
* controls for railroad track switching
* hardware in air traffic control radars
* hardware that controls elevators
* firmware for traffic lights
* guidance systems in old oil tankers
* pipeline control systems
* oil refinery systems
* industrial food refrigeration systems
Embedded systems are everywhere, and they’re not all trivially dismissed.
Sssssh, you just told the world my retirement plan!
Yes, what most programs are interested in is duration from boot time or process start time. Not some human level duration from some arbitrary point in the past.
Programs should just use time since boot. And only bother to convert when offered for human or external consumption.
I don't think this is even remotely true; it's certainly patently false for anything involving financial time. Your claim might be true for logs, but even then, it's far more readable for the person looking at the logs if it's wall-clock time. I don't want to have to translate every log statement I'm looking at relative to when the system or process started.
Personally, I deal with trading systems in the US; we record all timestamps in EPT down to microsecond resolution. Currently we only deal with US equities, so we don't need to worry about DST. I would prefer timestamps were in UTC, but well, legacy systems...
I cover all that in my mention of human consumption.
The point is that there is no necessary _strong_ _low-level_ connection between 'how long ago something happened on this computer' and how many years since the birth of JC.
There's still the problem of what happens if you're looking at a log from after a system restarted or crashed. Do you have the previous start time logged somewhere?
Personally, while I appreciate your arguments, I'm not sold on them.
You might enjoy this, which I wrote a few years ago. The project is dead now. https://robigalia.org/blog_drafts/2016/12/23/what-time-is-it...
I wonder how many of these systems would really be sensitive to the clock overflowing to 1901. HVAC, kitchen appliances, information signs, traffic lights... they all seem relatively low-risk to me. As long as things still happen in regular intervals, it's hard to imagine a crisis. (I would be surprised if many of these even have synced clocks in the first place, I can imagine them happily thinking it's somewhere in the 1980s and nobody is the wiser.)
Does your system need to connect to a service on the internet? Does that service use an SSL cert?
If so, it almost certainly has a not-before time, and overflowing back to January 1st, 1970 will be before that not-before time. And so any behavior relying on that IoT device connecting to a service will fail.
If a device isn't receiving any updates, SSL would be an issue even without the 2038 problem - eventually all the trusted root certs will be expired.
There is another theory which states that this has already happened.
Recently I brought up an emulated PPC running Mac OS 9. The included copy of Microsoft Internet Explorer couldn't connect to any site that used https -- and virtually everybody uses https these days.
That version of Internet Explorer did not support SNI, which is used by basically everything these days.
Not supporting SNI will get you a certificate error, but you’ll still be able to connect. What’s more likely is that old IE only supports SSLv3, which is disabled on most servers in light of POODLE.
And some certificates with long life are already post-2038
I certainly hope traffic lights are never connected to the internet.
I’d always assumed they must be connected to support the ability of fire engines to setup route plans or the central traffic management teams to be able to override light sequences.
I see now there are other novel ways of supporting preemption for emergency vehicles, although I’m still unconvinced that’s what our city uses as the lights often preempt several minutes in advance and from over a kilometre away.
Most US preemption devices are just a visible (and sometimes optionally infrared) strobe at 80 Hz. Visibility to the sensor/receiver device is largely subject to prevailing light: a bright day is more challenging; at night or in shade it's easy to pick out the strobe.
-- fire/ems provider.
Question, since I've always been curious about this: How difficult would it be to make your own preemption device? Are such things available on online stores?
I'm assuming it's illegal to use one obviously, but I wonder what precautions are used to prevent unauthorized usage, if that can even be disclosed to the public. What's stopping the average joe from getting one from Alibaba?
Honestly, very little, other than the risk of being observed. There may be systems that allow for authentication, but I've not seen them (though I'm not an expert).
There is an encoding standard for prioritization that some larger systems use, which mass transit may utilize, but it can be overridden by Fire/EMS/LE.
In Queensland, Australia it's all an (internal) networked thing, relies on a lot of hardware interacting on IP networks. I can't imagine what'll happen in the 2038 situation - hopefully they've got a good vendor :)
They don't have to be connected to the Internet for that functionality.
REST API for my city's traffic lights (and associated detectors): http://trafficlights.tampere.fi
Realtime map someone made with that: http://jouni.kapsi.fi/trevalot/
I think you're already a bit late.
Didn't you ever see the Italian Job circa 2003?
Fictional or not, cities are hyped up on 'smart infrastructure', sorry to deliver the bad news.
Even if the system isn't calendar aware, if all of a sudden you have negative timestamps you can have strange boundary conditions where loops might run forever, hanging the system.
I will never forget discovering that "SELECT FROM_UNIXTIME(-1)" would instantly crash MySQL.
I imagine that most of those applications are very much tied to the calendar — traffic lights and HVAC systems need to behave differently on different days.
Trying to minimize the amount of broken code that's still running in 2038 is a big part of why people are working hard to fix the problem now.
Maybe we could set them to 1907 and remain happy that near-future leap years and weekdays will align perfectly well. At least until 2100 needs to be skipped as a leap year ;)
Unix time: the number of seconds elapsed since 1 January 1970, stored as a signed 32-bit integer.
The minimum representable date turns out to be in December 1901, and zero is a valid value (1 January 1970)... I wonder what problems people with that birth date have? Similar to the person named Null, or Bobby Tables?
The first-generation Zune had a leap year bug that caused it to get stuck in an infinite boot loop on December 31, 2008. Not exactly the kind of thing you want happening to traffic infrastructure.
Even if it has no issues rolling over, a big problem could be that anything that works on a weekly cycle (I imagine traffic lights, HVAC, etc have different schedules for workdays compared to weekends) will skip some days, as Jan 19th 2038 is a Tuesday and Dec 13th 1901, where the clock wraps to, is a Friday.
Following up on this: the overflow is highly unlikely to happen at midnight, or at the top of an hour, in your local timezone. Even if a timezone or clock adjustment is possible on the system, leap years and other calendar offsets will differ, which can cause continuing frustration for any workaround that runs with an out-of-sync clock.
You never know, some devices may start incorporating AI algorithms that base behavior on past data, so a consistent timeline can become important even if there should be no inherent reliance on absolute dates.
The rule I follow is the same one I had for Y2K. If there is no interface to enter a date, you don't have to worry about it.
For example, on the sensationalized "documentaries" we had leading up to Y2K, they showed a stream of appliances and gadgets that had processors in them -- hair dryers, microwave ovens, dishwashers, food processors, etc saying they all would be vulnerable. Some of these display the time, but I haven't run across a microwave yet that accepted a date input.
These days, the "date input" tends to be the internet connection.
This is great for thinking about the non-internet connected devices. What's also interesting about this idea is that we can test what 2038 will be like by slowly shifting the time for certain networks forward to see if a subset of stuff will fail. If it does, we can resort to turning off systems.
Is "y2k38" a common way to reference this issue? That doesn't save any characters over "y2038".
It also doesn't cost any more characters, and instantly calls to mind the y2k issue to which it is related.
Traffic lights were a concern for Y2K... but it turns out that in the real world, traffic lights just need a day-of-week so they continue to work fine.
Ok but what happens when the date overflows? Will they still know Monday is Monday or will they start thinking Monday is Sunday?
I don't think you would need a complete date for that. I would think the modulo 7 of a day counter, with a little extra logic to handle leap year, would be enough.
Mod 7 is not a bad idea but if that is the algorithm why do you care about leap years?
I have been working in the embedded world for some time. What worries me is that even new embedded systems come with ancient kernels and stdlibs, most of which won't be y2k38 compatible.
Many of these devices will still be operating in 2038, but long forgotten. Think HVAC control systems, kitchen appliances, security systems, etc. Cars, trains, information signs, traffic lights, many of these public infrastructure systems will fail.
And these systems are very, very hard to update, if possible at all. We'll be having a lot of e-waste in 2038.
Aha, the real reason behind the 2008 crisis!
The 32-bit Unix 2038 end has already been an issue in some fields. For instance, in finance, it was already an issue in 2008 when calculating risk, payments, etc on say a 30 year bond.
> I wonder how many pre-2000 kernel versions are being used in devices right this moment.
That one is simple to answer: too many.
That doesn't change that it's a solid 19 years away.
Exactly what programmers were saying in 1981.
Were they wrong? Y2K was a non-event, unless you were a consultant cashing in.
"Fixing the brakes on my car was a non-event, unless you count the mechanic cashing in."
Routine automotive work is a good analogy. There was programming work to do, people did it in time, and life went on.
Well don't insult the people that fixed it and kept it from being a giant problem.
It wasn't a non-event, because it took a lot of effort to make everyone aware of the need to fix the bugs.
It's like routine maintenance except that every single device involved is going to hit problems right about at the same time, with almost no signs of trouble until the exact second it falls over. So it's not harder to fix, but if it's not fixed then it hits vastly harder.
My dad worked for an IT consulting firm in that era - so Y2K literally put a roof over my head and clothes on my back. I didn't see it as insulting. It was important work, everyone understood the urgency, worked diligently on fixing it, and got the job done with a minimum of fuss. The fact that so many people today think of Y2K as an overrated problem is testament to how good of a job everyone did.
I was one of those people fixing it. How is it insulting to say that the biggest impact was the enrichment of the people working on it?
"non-event unless you were cashing in" pretty strongly implies it wasn't necessary.
They were right!
"This may seem like a distant problem, but, as Tom Scott recently observed, the year-2038 apocalypse is now closer to the present than the year-2000 problem."
Wow, I knew it was approaching soonish, but that's an eye-opening way of putting it.
I wonder how many pre-2000 kernel versions are being used in devices right this moment.
Also, the assumption that "X" will always be enough. It may be sufficient for the foreseeable future, but we software engineers chronically underestimate how long the software we write will stick around.
Pretty sure I made some short-sighted decisions as an intern, and most of that code is approaching 20 years old, and I bet it hasn't been fixed or rewritten, because for the time being, it works.
When it comes to the kernel, "We don't break userspace" is explanation enough. Feel free to add swearwords in English or Finnish depending on how you think 68-year-old Linus will feel about the matter.
“Supporting 32-bit time system calls after 2038” will serve future generations as an example of the lengths that developers are supposed to go to in order to ensure backwards compatibility, even in the face of apparent absurdity.
Alternately, it will serve as a joke about how scared everyone was of breaking things written with our primitive and brittle 21st century programming techniques.
In other news there's a GPS epoch arriving this April: https://www.spirent.com/blogs/positioning/2018/january/gps-r...
Yeah, huge shout out to Initech specifically for their amazing work in that field.
It’s all thanks to hardworking software people that Y2K wasn’t.
I like stories like this. Finding where the overflow happens could be a lot of fun. How did you figure it out? Did it take a long time?
We launched a crime based RPG game on iOS back in the day that had a leaderboard which tracked in game net worth (cash, investments, etc).
I still remember the day we woke up to multiple emails and lawsuit threats from our top player whose net worth suddenly went negative on the leaderboard.
Our players were very competitive and loved pushing the boundaries by trying to find vulnerabilities, shortcuts, etc so we assumed the worst and thought we had a bug in our system! Turned out to be our JSON parsing library was using integerValue on our NSNumber instances causing the ints to overflow. It was a tough week waiting for Apple to review the app while our players lost faith in us.
Lesson learned :)
(Luckily the player and our community were understanding and stayed around)
Personally I'll wait for SO_TIMESTAMP_NEW_NEW_FINALTHISTIME.
It only makes sense from a shortsighted "today's" perspective. What happens if there is another API change? Do you call the new new suffix _NEWER? _NEWNEW?
Thankfully 64 is the endpoint. "We'll never need 128-bit time_t structs!" Famous last words. But really, we don't. We don't need to mix low-precision with high-precision times, "one for all". Calculating in ns does not make sense at the year scale.
I really don't like the _NEW suffix, as in SO_TIMESTAMP_NEW, opposed to the normal SO_TIMESTAMP64 naming scheme, as in the function names. But it renders the existing names as old and outdated, so it does make some sense.
This is only for 32-bit systems. 64-bit systems have always used a 64-bit timestamp.
The difficulty is that the ABI defines a 32-bit timestamp on 32-bit kernels so that can't easily be changed without breaking backwards compatibility. That's what this post is about.
So for desktops and servers this has been solved for years already. I don't know how prevalent 32-bit is in embedded versus 64-bit.
The popular Raspberry Pi platform only started shipping with a 64-bit ARM cpu in 2016, and ARM chipsets with 64-bit support only started appearing around 2012/2013. So I'd expect many new embedded platforms to continue using 32-bit systems well into the next decade.
There are still probably billions of devices made each year with 8 bit microcontrollers that are decades old designs. We won't be free of 32bit ARM until the turn of the century.
On the other hand, current mysql has this problem now, still unfixed, even running in 64 bit environments.
> Some of the final steps in this transition for the core kernel have been posted, and seem likely to be merged for 5.1.
This doesn't sound as good as they wanted it to sound. I thought they were already safe.
On the other hand, planned obsolescence could help and nobody is using 20 year old kernels in 2038 anymore. Hopefully.
I found this article better explains the issue: https://fossbytes.com/year-2038-problem-linux-unix/
Hopefully, this will be humanity's last problem of this kind.
Or, now that I think of it, hopefully it won't be.
But what are we going to do about the year-292,277,026,596 problem!? (And what about the year-2,147,485,547 for systems that have to do date computations!?)
Really? I'm not sure if this is bait or not, but 64b time_t extends the epoch over 200 billion years into the future. I figure we'll handle it in Linux 7.0 or 8.0 at the earliest :P
It's probably not bait. It's easy to intuitively think 64 bits is only double. I do these calculations regularly as security professional (password strength for example) but not everyone does.
Well, 64 bits is often used with nanoseconds, and 32 bits is in seconds, so it's a bit more complex than it seems.
The 64 bit clock will overflow about 292 billion years from now. That's long enough for me.
Worrying about 64-bit time overflow is like worrying about IPv6 address exhaustion.
Running out of IPv6 addresses is more likely than you think:
I'm going to set my cryogenic wake-up for then and make a whole bunch of money.
Be careful, the cryogenic controller might overflow and wake you up right after the big bang!
> Worrying about 64-bit time overflow is like worrying about IPv6 address exhaustion.
And for those following along at home IPV6 is a 128 bit value. ~2^93 is enough to individually address every subatomic particle on Earth.
> ~2^93 is enough to individually address every subatomic particle on Earth.
IPv6 address exhaustion is very unlikely, but it's more likely than some may have thought.
Only 64 bits of the 128-bit address space are fully utilized, which means the utilization rate is less than 0.0000000000000000055%. And no, it's not a bug, it's a feature. The huge address space was intended to make address assignment, configuration, and routing easier. That is, in the real world, the first 64-bit part becomes the network identifier, and the last 64-bit part becomes the device identifier. Ideally, everyone's home, business, institution, datacenter, etc, will get a unique, globally routable 64-bit network prefix.
So no, every subatomic particle on Earth will not get its own IPv6 address. But every one of the 7 billion people on Earth can have prefixes for their 2,635,249,153 personal computer networks.
Where did you get 2^93? I seem to get 3.6 × 10^51 which is about 2^171. Meanwhile the 128-bit addresses are only enough to give one to each cubic-micron nanobot on Earth (https://www.xkcd.com/865/).
While 32-bit time_t lasts for roughly 68 years, 64-bit time_t lasts for roughly 292270958501 years.
Well, what if you want to measure small fractions of a second? If your timestamps have significantly higher resolution than a nanosecond, you've used up your extra 32 bits. And there is technology that operates on much shorter intervals, like about 10^-18 seconds. While it may not be very relevant today, if you really want to future proof things for the next 1000 years, I think you need around 90 bits or more.
"12 attoseconds is the world record for shortest controllable time"
After further consideration, the ultimate timestamp would be able to denote any point in time measured in Planck units (10^-44 seconds) between the beginning of the universe and its ultimate heat death in about 3 x 10^21 seconds. Therefore 218 bits should be enough for anyone.
How much time does moving to 64 bits buy us? It's likely considerable, but I can't help but imagine a time when we have to address this problem a third time. It seems to me like a more robust solution is possible.