A long but well-written article; it's hard to disagree with any of its specific points. Fair warning: it's quite long, and reads like a sci-fi novel.

Curious to hear opinions. This seems alarming? But then again, doomsday predictions tend to be wrong.

  • paraphrand
    16 hours ago

    What’s interesting, to me, is that this is exactly how people hedge in the fringe UFO community too. The difference is that this AI prediction has real data woven into it, while the UFO people have “stories” and right-wing grifters in our current administration pushing the idea.

    I’m not trying to be dismissive; I just noticed that the year 2027 is popular for predictions, and has been for 4-5 years.

    Oh, and I forgot to list fusion power being cracked. I’ll go add it. There are a bunch of startups right now. Sam Altman just stepped down from the board of one of them to address conflict of interest issues. Allegedly.

    • postscarce@lemmy.dbzer0.com
      16 hours ago

      What’s interesting, to me, is that this is exactly how people hedge in the fringe UFO community too.

      Ha! True. Very true. I find this scenario compelling, but it’s built on a series of assumptions, each of which seems plausible on its own, yet I have no way to evaluate them all together. It’s like the Drake Equation: because the probabilities are multiplicative, even tiny adjustments to a few of the factors end up making a huge difference to the final answer.
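      The multiplicative sensitivity described above can be sketched with entirely hypothetical numbers (none of these factors come from the article; they only illustrate the arithmetic):

      ```python
      # Toy illustration of Drake-Equation-style sensitivity: the final
      # estimate is a product of factors, so modest tweaks to a few of
      # them compound into a large change overall. All values are made up.
      from math import prod

      baseline = [0.5, 0.3, 0.8, 0.4, 0.6, 0.5, 0.7]  # hypothetical probabilities
      # Halve just three of the seven factors.
      adjusted = [f * 0.5 if i < 3 else f for i, f in enumerate(baseline)]

      print(f"baseline product: {prod(baseline):.5f}")
      print(f"adjusted product: {prod(adjusted):.5f}")
      print(f"ratio: {prod(baseline) / prod(adjusted):.0f}x")  # prints 8x
      ```

      Halving three factors shrinks the product eightfold (2 × 2 × 2), which is why estimates like this can swing by orders of magnitude on small disagreements.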

      The thing is, though, if there really is even a tiny chance of this thought experiment’s ultimate outcome coming true (i.e. the end of humanity), then we should probably address it. What that would look like is stopping the AI companies from doing any further research until they can prove their models will be safe, which should also please people who are more concerned about AI slop. Everybody wins by hitting the brakes. (Edit: well, Sam Altman doesn’t, but I’m not going to lose sleep over that.)