• @[email protected]
    1 • 10 months ago

    either you can write an axiom that says “sentient beings should always consent to anything that is done to them” or you can write an axiom that says “you should always do what will bring about the most happiness or the least distress”

    those axioms are in conflict with one another. it’s not that there are only bad choices; it’s that you’ve given yourself conflicting standards.

    • @[email protected]
      1 • 10 months ago

      Neither of those is an axiom I hold. The axiom “all sentient beings are morally relevant” does not specify how to go from there, and I am not convinced that any one ethical framework is “the one”. There are some things that all the frameworks I’m aware of converge on from a sentientist perspective, but there are also weird cases where they don’t converge, like whether to euthanize stray animals.

        • @[email protected]
          1 • 10 months ago

          https://lemmy.antemeridiem.xyz/comment/2243561 I haven’t put my views in those terms before, but even here I say my views are based on sentience. I give an example, and I should have been clearer that I’m not strictly looking at the issue through a utilitarian lens, although I get why it would come across that way. At base I’m a sentientist, and I think there are many reasonable ways to go from there.