I think we can defederate that company’s name from our personal vocabulary instances.
The Y2K issue wasn’t just a scare though. If the devs and IT in general hadn’t had a strategy to overcome that ridiculous Windows issue, things could have gone bad. The media did media things and pushed it into a world-ending scenario though.
I’m pretty sure it wasn’t Windows that was the main offender, but rather legacy systems of all kinds written since the 1970s, whose authors weren’t expecting their programs to still be running more than 30 years later.
Surprise! Businesses don’t care whether the code is old, as long as it works - so the field the year was stored in only held two characters, with the 19 hard-coded in front of it.
1999 would be written as 99. 19 + 99 = 1999 = computers were happy.
2000 would be written as 00. 19 + 00 = 1900 = computers went to shit
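A minimal sketch of that failure mode, assuming the classic two-digit year field with a hard-coded 19 in front of it (the struct and function names here are made up for illustration):

```c
#include <stdio.h>

/* Illustrative only: a record that stores the year as its last two digits,
 * the way many pre-Y2K systems did. */
struct record {
    int yy;   /* 99 means 1999 */
};

/* Pre-Y2K habit: glue a hard-coded 19 (i.e. 1900) onto the two digits. */
static int full_year(const struct record *r) {
    return 1900 + r->yy;
}

int main(void) {
    struct record entered_1999 = { 99 };
    struct record entered_2000 = { 0 };

    printf("%d\n", full_year(&entered_1999));  /* 1999 - computers happy */
    printf("%d\n", full_year(&entered_2000));  /* 1900 - computers went to shit */
    return 0;
}
```

Any arithmetic done on that field (interest periods, ages, expiry checks) would suddenly come out a century off, which is what all the remediation work was about.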
Yeah, good explanation. I was too young to have any deeper knowledge about this issue back then and only saw it manifest when I had to adjust my Windows 95 clock :)
Next doom and gloom scenario is 2038, when poorly maintained *nix systems that count time as a signed 32-bit number of seconds since Jan 1st, 1970 will overflow and wrap back to 1901.
I’ll be pushing 68. Hopefully retired or dead by then.
… I’ll probably still be working, though…
Eh, it only being an issue for 32-bit systems will hopefully help. But of course somebody will still be running that in 15 years.
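A rough sketch of why 2038 bites 32-bit systems, assuming a signed 32-bit time_t counting seconds since the 1970 epoch (exact wrap behaviour varies by OS and library):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void) {
    /* Largest value a signed 32-bit "seconds since 1970-01-01" counter can hold. */
    int32_t last_second = INT32_MAX;

    time_t t = (time_t)last_second;
    char buf[64];
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&t));
    printf("last 32-bit second: %s UTC\n", buf);               /* 2038-01-19 03:14:07 */

    /* One tick later a two's-complement 32-bit counter wraps to INT32_MIN... */
    int32_t wrapped = INT32_MIN;
    printf("after the overflow: %ld seconds\n", (long)wrapped); /* -2147483648 */
    /* ...which, read as seconds since the epoch, lands back in December 1901.
     * How any given system actually reacts to that is another question. */
    return 0;
}
```

On systems that have already moved to a 64-bit time_t this particular overflow simply never happens, which is why it’s mainly a problem for whatever 32-bit gear is still limping along by then.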
Lots of financial institutions are still using software written in COBOL. My father graduated with a software engineering degree focused on COBOL in the mid-1970s. My company provides an external API for customers who still use green-screen terminals. Of course there will be people running 32-bit systems. And I’m sure there will be well-paid jobs fixing any date overflows on those systems.
It wasn’t Windows, as someone else already explained, but yeah general media spread misinformation as usual when it comes to technology.
I work in IT and I was there. It was a serious problem that, if not fixed, would indeed have ended in worldwide disaster. But we knew exactly what it was many years earlier, and exactly how to fix it, and we did, so of course nothing actually happened.
The media spread fear for nothing, instead of accurately reporting the situation and all the hard work IT people all over the world were doing to make sure everything would be fine.
In a way, the media hype wasn’t completely bad. It helped ensure there was budget to fix all those systems.
Preparedness paradox - if effective action is taken to mitigate a potential disaster, the avoided danger will be perceived as having been much less serious because of the limited damage actually caused.
Very relevant in the context of COVID - “we’re not seeing spikes, why are we still locking down and masking up?!” - and a significant driving factor feeding into those radical anti-COVID-protection “no new normal” ideologies.
It’s weirdly comforting knowing media hyperbole isn’t new or unique.
added it to my word filter in connect, yesterday. i was barely able to read your post
Added what?
I think he meant to reply to the main thread (which he did with the same comment) but didn’t delete this one (or maybe it didn’t federate?)
I mean in the media sense. There are some actually bad consequences. But the hype on here feels sensationalized.
I made so much money in the 90s working on Y2K stuff. I don’t know how bad it would have been if we didn’t, but people like me fixed a lot of shit in the late 90s to make sure things went smoothly.
You are one of our unsung heroes.
Thank you, it was mostly very boring work.
About 50% of the job was reading and understanding source code and checking that nothing would go wrong at Y2K. We needed 100% coverage, with different steps: manual reading (by two or even three pairs of eyes), manual testing, and writing small test scenarios and scripts that could be run through multiple times to check everything. Even when you had to check 5000 lines of code that did nothing with dates, it still needed to be checked thoroughly by multiple people, tested, documented, etc.
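To give an idea, one of those small, repeatable test scenarios could have looked something like this, assuming the routine under review expanded two-digit years with a pivot window (the expand_year function and its pivot of 50 are made up for illustration, not from a real project):

```c
#include <stdio.h>

/* Hypothetical routine under review: expand a two-digit year using a
 * pivot window (00-49 -> 2000s, 50-99 -> 1900s). */
static int expand_year(int yy) {
    return (yy < 50) ? 2000 + yy : 1900 + yy;
}

/* The "scenario": a fixed list of boundary cases that could be re-run
 * after every change and signed off by multiple reviewers. */
int main(void) {
    struct { int yy, expected; } cases[] = {
        { 99, 1999 }, { 0, 2000 }, { 1, 2001 }, { 49, 2049 }, { 50, 1950 },
    };
    int failures = 0;

    for (size_t i = 0; i < sizeof cases / sizeof cases[0]; i++) {
        int got = expand_year(cases[i].yy);
        if (got != cases[i].expected) {
            printf("FAIL: yy=%02d expected %d got %d\n",
                   cases[i].yy, cases[i].expected, got);
            failures++;
        }
    }
    printf(failures ? "Y2K check: FAILED\n" : "Y2K check: OK\n");
    return failures ? 1 : 0;
}
```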
I thought it was insane to put so much effort into code that would work just fine after 2000. But my manager said the customer doesn’t pay us to fix the software, the customer pays for the check, for the signature, as he put it. It was still boring, but at least I could understand it better.
There was a lot of FUD around Y2K. A lot of companies could just have hired some programmer to get familiar with the code in December and put in fixes as problems cropped up in January. But with all the media attention and buzz about THE Y2K bug, customers were getting anxious. They were told it would be pandemonium in January, that getting hold of any kind of software developer would be impossible, that costs would go through the roof, and that companies would fold if they didn’t fix their shit.
My first Y2K project was in 1997 and we had a team of 15+ people, with only 3 actual software engineers like me. The rest were legal staff, project managers, administrative staff, etc. On some projects there was actual hardware or firmware involved, but most of the work was pure software. Rates for Y2K projects were also huge; it was like an unwritten rule of the industry at the time: all Y2K projects got double the staff, double the price and double the time.
I’ve even worked on a project where we were the second team, a backup team if you will. Another company had done the work before us, but the customer wanted to make doubly sure, so they hired us to redo the entire project. We only found out later that this was the case; they had told us nothing so as not to influence us. It seemed like a crazy waste of money to me, but in those days that was somehow possible.
Did you work on anything that would actually have been disastrous if not fixed?
Nope, mostly accounting software, work registration, stuff like that. It would have been disastrous for our clients, but for the world as a whole it wouldn’t really matter.
added it to my word filter in connect, yesterday. i was barely able to read your post
the other naughty word rhymes with Kobra Kai. how many other words are bleeped/what is the post for you?