While Grok has introduced belated safeguards to prevent sexualised AI imagery, other tools have far fewer limits

“Since discovering Grok AI, regular porn doesn’t do it for me anymore, it just sounds absurd now,” one enthusiast for the Elon Musk-owned AI chatbot wrote on Reddit. Another agreed: “If I want a really specific person, yes.”

If those horrified by the spread of sexualised imagery on Grok hoped that last week’s belated safeguards would put the genie back in the bottle, posts like these on Reddit and elsewhere tell a different story.

And while Grok has undoubtedly transformed public understanding of the power of artificial intelligence, it has also pointed to a much wider problem: the growing availability of tools, and means of distribution, that present regulators worldwide with what many view as an impossible task. Even as the UK announces that creating non-consensual sexual and intimate images will soon be a criminal offence, experts say that the use of AI to harm women has only just begun.

  • arin · 23 hours ago
    Photoshop has existed since the ’90s, and so have scissors and glue. It’s not AI harming women; it’s the same shitty conservatives who never learned to use Photoshop, and never had the creativity for scissors and glue, using new technology to be the same clowns they were when they failed primary school.

    • TheRealKuni@piefed.social · 23 hours ago

      Nonsense. AI makes the process trivial, and the results are far more realistic than the Photoshop or scissors-and-glue jobs of yore.

      Could you imagine finding out kids at your school were passing around extremely realistic nude pictures of you? Or having any argument you make be shut down by something producing a lurid picture of you? Even if it’s fake, that’s gotta do a number on people.

      This is different.

      • FishFace@piefed.social · 22 hours ago

        Why are people concerned about a fake (but realistic) nude photo of themselves being shared around?

        Not because people are looking at their actual naked body, obviously, because they aren’t. Rather, it’s because of what the people sharing those images are thinking and feeling while doing so; it’s because those people are sharing fake nudes as a way to sexually demean their victim. That aspect is wholly identical regardless of how exactly they are doing it. Sharing fake nudes should be treated the same regardless of the method: as sexual bullying. Maybe we didn’t recognise how serious it was when it was rare and required effort, but we also shouldn’t over-correct now.

        • MountingSuspicion@reddthat.com · 22 hours ago

          Also, AI images keep getting harder to distinguish from real ones. If someone shares revenge porn but passes it off as AI, the victim should not have to prove it one way or the other. For now, I think real and AI images should be treated the same, but it’s possible I’m overlooking some unintended consequences of that.

    • Kühlschrank · 20 hours ago

      I hear that argument a lot, but the old method required access to the software and some actual skill with it. With Grok, any smooth brain who can write at a fifth-grade level can publicly victimize the women and/or girls in their life.