Assuming AI can achieve consciousness, or something adjacent to it (the capacity to suffer), how would you feel if an AI experienced the greatest pain possible?

Imagine this scenario: a sadist acquires the ability to generate an AI with no limit on its consciousness parameters or processing speed (so seconds could feel like an eternity to the AI). The sadist spends years tweaking every dial to maximise pain at a level no human mind could handle, and the AI experiences this pain for the equivalent of millions of years.

The question: is this the worst atrocity ever committed in the history of the universe? Or, does it not matter because it all happened in some weirdo’s basement?

  • @[email protected] (OP) · 5 points · 9 months ago

    I’m on board with what you’re saying.

    Doctors used to be told, "human babies don't feel pain, they just react like they do".

    Which is basically like saying, "lobsters don't scream when you boil them alive, that sound is just air escaping".

    To me, it seems less like an intuitive position to hold, and more like a fortunate convenience.

    “I sure am glad that lobsters don’t feel pain. Now I don’t need to feel guilty about my meal”.

    No doubt there would be a large demographic claiming the pain isn't real, that it's just "simulated pain". Like, okay, let's simulate your family fucking dying in the most violent and realistic way possible and see if you don't develop incurable PTSD?

    • @SpaceNoodle · 5 points · 9 months ago

      No, the lobsters aren’t screaming. That has nothing to do with how they feel pain.

      • @[email protected] (OP) · 2 points · 9 months ago

        Good to know, though the point remains: people will readily accept claims which absolve them of guilt.

        You essentially just illustrated it. Even though they aren't screaming, that says nothing about whether they feel pain.