Taylor Swift is living every woman’s AI porn nightmare — Deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

  • @[email protected]
    link
    fedilink
    English
    -4
    edit-2
    11 months ago

    So she’s less a victim because she’s wealthy? My god you people can justify anything, can’t you?

    • @Tangent5280 • 8 points • 11 months ago

      That is exactly it. She will suffer less compared to someone else this might have happened to, and if you define victimhood on a spectrum, she’s less of a victim than housewife, community leader, and preschool teacher Margaret from Montana.

      • @[email protected]
        link
        fedilink
        English
        -911 months ago

        Gross dude. Very gross. Blocking you now as someone who thinks the wealthy can’t be victimized can’t possibly have anything of value to contribute.

        Do better.

        • @[email protected]
          link
          fedilink
          English
          211 months ago

          The guy said less victimized, and you concluded he meant cannot be victimized. Can you be any more stupid?

    • @MJKee9 • 7 points • 11 months ago

      You just keep shifting your argument to create some sort of sympathy, I guess. No one says a rich person isn’t a victim. The point is that being a victim as a wealthy and influential woman like Taylor is a lot different than being a victim in a working-class context. If you disagree with that, then you’re either being intellectually dishonest or living in a dream world.

      Even the law agrees. It’s a lot harder for a celebrity to win a defamation lawsuit than it is for a normal person. You typically have to show actual malice. Frankly, that’s the legal standard that would probably apply to any lawsuit involving the deepfakes anyway.

        • @MJKee9 • 0 points • 11 months ago

          It’s not a crime.

          • @[email protected]
            link
            fedilink
            English
            111 months ago

            So, creating nude AI deepfakes isn’t a crime? Then there are no victims at all. What’s everyone talking about then?

            • @MJKee9 • 1 point • 11 months ago

              It can’t be a crime unless there is a criminal statute that applies. See if you can find one that applies.

                • @MJKee9 • 0 points • 11 months ago

                  Your response doesn’t logically respond to my comment. It attempts to reframe the argument by setting up a strawman, and it shows that you fail to understand (or are choosing to ignore, because it doesn’t support your new reframed argument) the difference between civil and criminal law in the United States.