• @[email protected]
    18 hours ago

    Lol. This comment sent me down a rabbit hole. I still don’t know if it’s logically correct from a non-physicalist POV, but I did come to the conclusion that I lean toward eliminative materialism and illusionism. Now I don’t have to think about consciousness anymore, because it’s just a trick our brains play on us (consciousness always seemed poorly defined to me anyway).

    I guess when AI appears to be sufficiently human or animal-like in its cognitive abilities and emotions, I’ll start worrying about its suffering.

    • @[email protected]
      8 hours ago

      If you wanna continue down the rabbit hole, I added some good stuff to my original comment. But if you’re leaning toward epiphenomenalism, might I recommend this one: https://audioboom.com/posts/8389860-71-against-epiphenomenalism

      Edit: I thought of another couple of things for this comment.

      You mentioned consciousness not being well-defined. It actually is: the go-to definition comes from Thomas Nagel’s 1974 essay “What Is It Like to Be a Bat?”

      It’s a pretty easy read, as are all of the essays in his book Mortal Questions, so if you have even a mild interest in this stuff you might enjoy that book.

      Very Bad Wizards has at least one episode on it, too. (Link tbd)

      Speaking of Very Bad Wizards, they have an episode about sex robots (link tbd) where (IIRC) they discuss the moral problems with having a convincing human replica that can’t actually consent — an argument that doesn’t even require bringing consciousness into it.