• Rekall Incorporated (OP)

    I don’t have any stats to back this up, but I wouldn’t be surprised if failure rates were higher back in the 90s and 2000s.

    We have much more sophisticated validation technologies and the benefit of industry, process and operational maturity.

    It would be interesting to actually analyze the real-world dynamics around this.

    • @[email protected]

      Not very many people had a dedicated GPU in the 90s and 2000s. And there’s no way the failure rate was higher; not even Limewire could melt down the family PC back then. It sure gave it the college try, but it was usually fixable. The biggest failures, bar none, were hard drives or media drives.

      • @[email protected]

        Dedicated GPUs were pretty common in the 2000s; they were required for most games, unlike the 90s, which were an unstandardized wild west. The failure rate had to be higher: I had three cards die with less than two years of use on each in the 2000s. Cases back then had terrible airflow, and graphics demands jumped quickly.

      • @Jimmycakes

        We all did; they used to cost like 60 bucks.

    • tehWrapper

      I’m going to guess the number of cards made is also much higher than in the 90s and 2000s, since hardware tech is way more popular and used in way more places around the world. So maybe a lower failure percentage, but a higher total count.

      But I have no idea…
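
      A minimal back-of-the-envelope sketch in Python (every number in it is a made-up assumption, purely for illustration) of how a lower failure rate on a much larger install base can still mean more failed cards in absolute terms:

          # Hypothetical shipment volumes and failure rates; not real data.
          shipments = {
              "2000s": {"units": 50_000_000, "failure_rate": 0.05},
              "2020s": {"units": 400_000_000, "failure_rate": 0.01},
          }

          for era, s in shipments.items():
              failures = s["units"] * s["failure_rate"]
              print(f"{era}: {s['failure_rate']:.0%} of {s['units']:,} units "
                    f"-> {failures:,.0f} failures")

          # Even with the assumed failure rate dropping 5x, the far larger
          # install base yields more failed cards in total (4.0M vs 2.5M here).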