Same with Y2K: it was only a nothingburger because it was taken seriously and funded well. But the narrative is sometimes “yeah lol, it was a dud.”
All this hysteria over nuclear weapons is overblown. We’ve known how to build them for 75 years yet there hasn’t been a single one detonated on inhabited American soil. They’re harmless
You even dropped a few accidentally and nothing happened! Complete duds these things really
Yeah but not all people live on American soil…
It’s the American tradition to ignore that
The question is: what will happen in 2038, when a Y2K-style failure happens again because signed 32-bit Unix timestamps (seconds since 1970) overflow? People are already sounding the alarm, but who knows if all of the systems will be fixed before it hits.
It’s already been addressed in Linux (not sure about other OSes). They doubled the size of the time value, so now you can keep using it for roughly 292 billion years, which might as well be the heat death of the universe. If you’re around then.
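For the curious, here’s a minimal C sketch of both halves of that: where a signed 32-bit timestamp tops out, and how much headroom 64 bits buys (dates printed in UTC):

```c
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    /* A signed 32-bit count of seconds since 1970-01-01 tops out
     * at 2^31 - 1 = 2147483647, i.e. 2038-01-19 03:14:07 UTC. */
    time_t last_second = (time_t)INT32_MAX;
    printf("32-bit time_t ends at: %s", asctime(gmtime(&last_second)));

    /* A signed 64-bit counter is good for roughly 292 billion years. */
    printf("64-bit headroom: ~%" PRId64 " years\n",
           INT64_MAX / (int64_t)31556952); /* seconds per average year */
    return 0;
}
```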
Finally, it’d be the year of desktop Linux, once all the Windows users die off
This is the funniest comment I have ever read here. Thank you.
Debian, for example, is currently at work recompiling everything from 32-bit to 64-bit timestamps (thanks to open source, this is no problem). No idea what happens to proprietary legacy software.
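For reference, on 32-bit glibc targets that rebuild mostly comes down to compiling against the 64-bit time ABI that glibc has offered since version 2.34; a quick sanity check you can run yourself:

```c
/* Build on a 32-bit glibc system with:
 *   gcc -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 check.c -o check
 * With the flags, time_t is 8 bytes; without them it stays 4 bytes
 * and overflows in 2038. (On 64-bit systems it is 8 either way.) */
#include <stdio.h>
#include <time.h>

int main(void) {
    printf("sizeof(time_t) = %zu bytes\n", sizeof(time_t));
    return 0;
}
```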
I think everything works in Windows except the old Windows Media Player. You can test it by setting the time in a Windows VM to 2039.
Obviously new systems are unaffected; the question is how many industrial controllers checking oil pipeline flow levels or whatever were installed before the fix and never updated.
Being somewhat adjacent to that in my work: there’s a good chance that anything in a critical area (hopefully fields like utilities and petroleum, anywhere with enough energy to cause harm) has decently hardened or updated equipment, so it either isn’t an issue, or the device stops reporting trend data correctly, or it rolls over to date “0”, which on industrial platforms tends to be 1970 in my personal experience. That said, there’s always the case that it won’t be handled correctly and the process either runs away or stops entirely.
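For what it’s worth, that date “0” is just the Unix epoch, which is why 1970 shows up; a tiny C sketch:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    /* A device that resets its clock to timestamp 0 reports the
     * Unix epoch: Thu Jan  1 00:00:00 1970 (UTC). */
    time_t zeroed = 0;
    printf("%s", asctime(gmtime(&zeroed)));
    return 0;
}
```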
As a future Boltzmann brain, I agree.
2038 is approaching super fast and nobody seems to care yet
At the rate of one year per year, even.
For each second that passes we’re one second closer to 2038
Except for leap seconds. Time is the worst to work with :(
omg
AFAIK that’s not entirely true; e.g. Debian is changing the system time from a 32-bit integer to 64-bit, so I assume other distros are doing this as well. However, this does not help industrial or IoT devices running deprecated Unix/Linux derivatives.
This is my concern: all the embedded devices happily running in underground systems like pipes and cables. I assume there are at least a few which nobody even considered patching because they’ve “just worked” for decades!
Or like… PLANES! Some planes still update their firmware using floppy disks
They do at least get updates though, and they’re big enough that they don’t get forgotten!
Well that’s justifiable. We’re not sure if we’re even going to make it to then
I can’t remember the name but I think this is some kind of paradox.
Like, the preventative measures were so effective that they created a perception that there was no risk in the first place.
It’s called the prevention paradox: an issue is so severe that it gets prevented with proactive action, so no real consequences are felt, and people think it wasn’t severe in the first place.
Case in point: measles. It was a thing when I was a kid. Then it wasn’t. Now my kids have to deal with measles because we can’t teach scientific literacy.
That waste-of-effort Cold War… /s
“Lol Elon rocket go boom, science isn’t real” is also happening
Stupid people just think they’re the smartest ones in the room now
Y2K specifically makes no sense though. Any reasonable way of storing a year would use a binary integer of some length (especially when you want to use as little memory as possible). The same goes for manipulations; they are faster, more memory efficient, and easier to implement in binary. With an 8-bit signed integer counting from 1900, the concerning overflows would occur in 2028, not 2000. A base 10 representation would require at least 8 bits to store a two digit number anyway. There is no advantage to a base 10 representation, and there never has been. For Y2K to have been anything more significant than a text formatting issue, a whole lot of programmers would have had to go out of their way to be really, really bad at their jobs. Also, usage of dates beyond 2000 would have increased gradually for decades leading up to it, so the idea it would be any sort of sudden catastrophe is absurd.
The issue wasn’t using the dates. The issue was the computer believing it was currently one of those dates.
I’m going to assume you aren’t old enough to remember, but the “only two digits to represent the year” issue predates computers. Lots of paper forms just gave two digits. And a lot of early computer work was just digitising paper forms.
I remember paper forms having “19__” in the year field. Good times
You’re thinking of the problem with modern solutions in mind. Y2K originates from punch cards, where everything was stored as characters. To save space, only the last 2 digits of the year were stored, because back then you didn’t need to store the “19” of year 19xx. That technique of storing data stayed the same for a long time, even as technology advanced beyond punch cards. The assumption that it’s always 19xx caused the Y2K bug: once the year rolls over to 00, the system doesn’t know if it’s 1900 or 2000.
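A hypothetical sketch of that kind of record layout in C (the struct is made up, but the ambiguity is the point):

```c
#include <stdio.h>

/* Hypothetical punch-card-style record: the year is two characters,
 * and the "19" century is assumed rather than stored. */
struct card_record {
    char year[2]; /* e.g. {'9','9'} for 1999 */
};

int main(void) {
    struct card_record r = { { '0', '0' } }; /* 1900 or 2000? No way to tell. */
    int yy = (r.year[0] - '0') * 10 + (r.year[1] - '0');
    printf("stored \"00\" -> interpreted as %d\n", 1900 + yy); /* prints 1900 */
    return 0;
}
```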
Some of the computers in question predate standardizing on 8 bits to the byte. You’ve got a whole post here of bad assumptions about how things worked.
You don’t spend much time around them, do you?
You do realize that “counting from 1900” meant storing only the last two digits and just hardcoding the programs to print “19” in front of it in those days? At best, an overflow would lead to 19100, 1910 or 1900, depending on the print routines.
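That exact failure mode is easy to reproduce in C, since struct tm’s tm_year field counts years since 1900:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    time_t now = time(NULL);
    struct tm *t = gmtime(&now);

    /* Buggy print routine: "19" hardcoded in front of tm_year.
     * tm_year is years since 1900, so in the year 2000 this
     * printed the infamous "19100". */
    printf("buggy:   19%d\n", t->tm_year);

    /* The fix: add 1900 instead of pasting digits together. */
    printf("correct: %d\n", t->tm_year + 1900);
    return 0;
}
```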
Oh boy, you heavily underestimate the amount and level of bad decisions in legacy protocols. Read up on the topic: the date was for a long time stored as 6 decimal digits.
And then there is PIC 99 in COBOL. In modern languages it makes no sense, but there is still a lot of really old code around, and not everything is two’s complement, especially if you do not need the efficiency in memory and calculations.
Look up some info on BCD or EBCDIC.
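If you haven’t met BCD: it packs one decimal digit per nibble, which is why fields like COBOL’s PIC 99 aren’t two’s complement at all. A minimal C illustration:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Packed BCD: one decimal digit per nibble, so the byte 0x99
     * literally encodes "99". Mainframe date fields were often
     * stored this way, and arithmetic on them is decimal, not
     * binary two's complement. */
    uint8_t year_bcd = 0x99;
    int tens = year_bcd >> 4;
    int ones = year_bcd & 0x0F;
    printf("BCD 0x%02X decodes to %d%d\n", (unsigned)year_bcd, tens, ones);
    return 0;
}
```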