Taylor Swift is living every woman’s AI porn nightmare: deepfake nudes of the pop star are appearing all over social media. We all saw this coming.

  • @Coreidan
    -4
    11 months ago

    Ya yuge nightmare being a billionaire must be. Poor rich person

    • @[email protected]
      -5
      11 months ago

      So you’re saying that the more wealth a person has, the more they deserve crimes against them? Come on now, kid. Do you really want to think this way?

      • @MJKee9
        9
        11 months ago

        That’s not their point and you know it. Get your bad-faith debating tactics out of here.

        She isn’t living “every woman’s nightmare” because a woman without the wealth and influence Taylor has might actually suffer significant consequences. For Taylor, it’s just a weird Tuesday. For an average small-town lady, it might mean loss of a job, loss of a partner, estrangement from family and friends… That’s a nightmare.

        • @[email protected]
          -4
          11 months ago

          So she’s less of a victim because she’s wealthy? My god, you people can justify anything, can’t you?

          • @Tangent5280
            8
            11 months ago

            That is exactly it. She will suffer less compared to someone else this might have happened to, and if you define victimhood on a spectrum, she’s less of a victim than housewife, community leader, and preschool teacher Margaret from Montana.

            • @[email protected]
              -9
              11 months ago

              Gross, dude. Very gross. Blocking you now, as someone who thinks the wealthy can’t be victimized can’t possibly have anything of value to contribute.

              Do better.

              • @[email protected]
                2
                11 months ago

                The guy said “less victimized” and you conclude he meant “cannot be victimized.” Can you be any more stupid?

          • @MJKee9
            7
            11 months ago

            You just keep shifting your argument to create some sort of sympathy, I guess. No one says a rich person isn’t a victim. The point is that being a victim as a wealthy and influential woman like Taylor is a lot different than being a victim in a working-class context. If you disagree with that, then you’re either being intellectually dishonest or living in a dream world.

            Even the law agrees. It’s a lot harder for a celebrity to win a defamation lawsuit than it is for a normal person. You typically have to show actual malice. Frankly, that’s the legal standard that would probably apply to any lawsuit involving the deepfakes anyway.

              • @MJKee9
                0
                11 months ago

                It’s not a crime.

                • @[email protected]
                  1
                  11 months ago

                  So, creating nude AI deepfakes isn’t a crime? Then there are no victims at all. What’s everyone talking about, then?

                  • @MJKee9
                    1
                    11 months ago

                    It can’t be a crime unless there is a criminal statute that applies. See if you can find one that does.