I still can’t believe this shit was real.
Like the massive oversights that happened. The blatant incompetence of like every single computer nerd in the world.
All because not one fuckin person thought about what happens when you run out of number space for the date.
I feel like someone should’ve been excommunicated over this shit.
I take it you were still in nappies back then. It wasn’t incompetence at all; it was a simple trade-off. Storage space and memory on early computers were expensive and very limited, so storing “70” instead of “1970” saved two bytes in every single date field. This is rather like IPv4 - who would have guessed WAY back then that there would be more than a bajillion individual IP devices? For Y2K, we’re talking mainly 1960s-1970s code, so it was partly “hey, we have 40 years to deal with this”, and programmers in the late 70s and 80s were well aware of it and began dealing with it. The Y2K scare itself was way over-hyped… it was really just legacy systems that were any kind of actual problem. The whole “turn off your computer” thing was utter nonsense.
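The two-digit trade-off described above can be sketched in a few lines. This is hypothetical illustration code, not from any real system: storing only the last two digits of the year is fine until arithmetic crosses the century boundary.

```python
# Sketch of the classic two-digit-year trade-off: keeping "70" instead
# of "1970" saved storage per record, but subtraction across the
# century boundary produces nonsense.

def age_in_years(birth_yy: int, current_yy: int) -> int:
    # Two-digit arithmetic: correct only while both years are in the 1900s.
    return current_yy - birth_yy

# Fine in 1999: someone born in 1960 is 39.
print(age_in_years(60, 99))   # 39

# Broken in 2000: 00 - 60 gives -60 instead of 40.
print(age_in_years(60, 0))    # -60
```

The actual Y2K fixes were variations on either widening the field to four digits or "windowing" (interpreting, say, 00-29 as 2000-2029 and 30-99 as 1930-1999), which just pushed the problem further out.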
I remember hearing all the hype and decided to test my poor 486 running Windows 3.11. I checked the date range in the BIOS (it went further into the future than that PC lasted), set the date to 2000, then booted into Windows. The only issue I saw was that the stuff I saved in 2000 had the date 19;0. Fun times.
Naturally! I did the same. I was administering a bunch of servers so of course we ignored the hype and just… shocking, I know… tested it.
Except the response to this wasn’t “Let’s figure out a timecode that uses letters instead of numbers and has accuracy down to the femtosecond and enough space for 10^20 years.” IPv6 was massive overkill.
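For scale on the “overkill” point, here is the back-of-the-envelope address-space arithmetic (just raw numbers, not anything claimed in the thread):

```python
# Comparing IPv4 and IPv6 address spaces.
ipv4_addresses = 2 ** 32     # 32-bit addresses: ~4.3 billion total
ipv6_addresses = 2 ** 128    # 128-bit addresses: ~3.4 * 10^38 total

print(f"IPv4: {ipv4_addresses:,}")    # 4,294,967,296
print(f"IPv6: {ipv6_addresses:.2e}")  # 3.40e+38

# Even with ~8 billion people, that's on the order of 10^28 IPv6
# addresses per person.
per_person = ipv6_addresses // (8 * 10 ** 9)
print(f"{per_person:.1e}")
```

Jumping from 32 to 128 bits wasn’t arbitrary, though: the extra width lets IPv6 hand out whole /64 subnets per network and do stateless autoconfiguration, which a merely “big enough” 48- or 64-bit space couldn’t.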
What was it that Bill Gates supposedly said, way back in 1981?
“640K of memory ought to be enough for anybody.”
(He denies ever saying it, for what that’s worth.)
Moore’s Law sucker-punched everybody.
But yeah, they all seemed to be thinking small. And really narrow.