• Rekall Incorporated@lemm.eeOP
    21 days ago

    I don’t have any stats to back this up, but I wouldn’t be surprised if failure rates were higher back in the 90s and 2000s.

    We have much more sophisticated validation technology now, plus the benefit of industry, process, and operational maturity.

    It would be interesting to actually analyze the real-world dynamics around this.

    • GrindingGears@lemmy.ca
      20 days ago

      Not very many people had a dedicated GPU in the 90s and 2000s. And there’s no way the failure rate was higher; not even Limewire could melt down the family PC back then. It sure gave it the college try, but the machine was usually fixable. The biggest failures, bar none, were hard drives or optical drives.