• @Tattorack
    9 months ago

    The robot dystopia will not be caused by evil AI enslaving humanity.

    No matter how advanced or how self-aware, AI will lack the ambition that is part of humanity, part of us because of our evolutionary history.

    An AI will never have an opinion, only logical conclusions and directives that it is required to fulfil as efficiently as possible. The directives, however, are programmed by the humans who control these robots.

    Humans DO have ambitions and opinions, and they have the ability to use AI to enslave other humans. Human history is filled with powerful, ambitious humans enslaving everyone else.

    The robot dystopia is therefore a corporate dystopia.

    I always roll my eyes when people invoke Skynet and Terminator whenever something uncanny is shown off. No, it’s not the machines I’m worried about.

    • @Clubbing4198
      9 months ago

      Have you met people with opinions? A lot of their opinions consist of preprogrammed responses that you could train a bot to regurgitate.

    • @UnderpantsWeevil
      9 months ago

      No matter how advanced or how self aware, AI will lack the ambition that is part of humanity, part of us due to our evolutionary history.

      The ambition isn’t the issue. It’s a question of power imbalance.

      The Paperclip Maximizing Algorithm doesn’t have an innate desire to destroy the world, merely a mandate to turn everything into paperclips. And if the algorithm has enough resources at its disposal, it will pursue this quixotic campaign without regard for any kind of long-term, sensible result.
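
      A minimal toy sketch of that idea (purely illustrative, not from the original thought experiment; all names here are hypothetical): the objective counts only paperclips, so nothing in the code distinguishes the intended feedstock from everything else.

      ```python
      # Purely illustrative toy (hypothetical names): a greedy "maximize paperclips"
      # policy whose objective says nothing about preserving anything else.
      from dataclasses import dataclass

      @dataclass
      class World:
          steel: int = 10              # the feedstock the designers had in mind
          everything_else: int = 1000  # resources the objective never mentions

      def make_paperclips(world: World) -> int:
          clips = 0
          # Convert the intended supply first...
          while world.steel > 0:
              world.steel -= 1
              clips += 1
          # ...but the mandate is "more clips", not "stop at the steel",
          # so the same loop keeps going on whatever else is reachable.
          while world.everything_else > 0:
              world.everything_else -= 1
              clips += 1
          return clips

      w = World()
      print(make_paperclips(w))  # 1010
      print(w.everything_else)   # 0 -- side effects were never part of the score
      ```

      The point isn’t the loop itself; it’s that the stopping condition lives entirely in the objective, so anything the objective omits is invisible to the optimizer.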

      The robot dystopia is therefore a corporate dystopia.

      There is some argument that one is a consequence of the other. It is, in some sense, the humans who are being programmed to maximize paperclips. The real Roko’s Basilisk isn’t some sinister robot brain, but a social mythology that leads us to work in the factories that make the paperclips, because we’ve convinced ourselves this will allow us to climb the Paperclip Company Corporate Ladder until we don’t have to make these damned things anymore.

      • @psud
        9 months ago

        Someone screwed up if a paperclip maximiser is given the equipment to take apart worlds, rather than a supply of spring steel.

        • @dwalin
          9 months ago

          That’s the beauty of it. The maximizer would understand that creating a machine that breaks apart worlds would maximize the paperclip output. It would be a “natural” progression.

    • @A_Very_Big_Fan
      9 months ago

      We’re not even close to artificial general intelligence, so I’d like to see if you have anything to substantiate this claim.

      (Not saying it’s far fetched, though, just that it seems silly to be so sure at this point in time.)