Teen boys use AI to make fake nudes of classmates, sparking police probe::Parents told the high school “believed” the deepfake nudes were deleted.

  • @[email protected]
    link
    fedilink
    English
    261 year ago

    And you're proof that the pedo registry shouldn't exist as is.

    Teenagers being sexually interested in their peers is not pedophilia, and you want to guarantee ruining a decade of their lives, with the “promise” of an expungement that would never actually happen, thanks to the permanent nature of the internet.

    This misuse of AI is a crime and should be punished and deterred, obviously. But labeling children about to enter the world as pedophiles basically for the rest of their lives?

    You're kind of a monster.

    • r3df0x ✡️✝☪️
      -1 points · 1 year ago

      What about the fact that the girls who are victims of something like this will have to contend with the pictures being online if someone posts them there? What if people who don’t know that the pictures depict minors re-post them to other sites, making them very difficult to remove? That can cause very serious employability problems. It doesn’t matter how open-minded people are, they don’t want porn coming up if someone googles one of their employees.

      • @[email protected]
        link
        fedilink
        English
        41 year ago

        The creation is still a crime, no one said otherwise.

        It is just not an act of pedophilia.

    • r3df0x ✡️✝☪️
      -5 points · 1 year ago

      If you produce CP, you should be on a registry for producing and distributing CP. If you create CP, you are enabling pedophilia.

      • @[email protected]
        link
        fedilink
        English
        61 year ago

        They are children. Being horny about classmates.

        Being sexually aroused by people your own age and wishing to fantasize about it is not enabling pedophilia, you literal psychopath.

        • r3df0x ✡️✝☪️
          -4 points · 1 year ago

          Circulating porn of minors is a crime and enables pedophiles. Not to mention teenage girls could easily commit suicide over something like this.

          • @Fades
            3 points · edited · 1 year ago

            So do yearbooks and any other kind of photos that depict children, for that matter.

            You can’t keep moving the goalposts; by your logic, young people should never date or take photos together because it could enable pedophiles somewhere, somehow.

            These are children with brains still in development; they are discovering themselves, and you want to label them pedophiles forever because they didn’t make a conscious effort to research how their spanking material could potentially enable a pedo (because we all know pedos can only be enabled by things produced by kids… yeah, that’s the real threat).

            Instead of suggesting a way to help the victims, you are advocating for the creation of yet more victims.

            What a pathetic, brain-dead stance you are defending.

              • @eatthecake
                2 points · 1 year ago

              Abuse and bullying of their classmates is just ‘discovering themselves’? Discovering that they’re psychopathic little misogynists, I guess. Their ‘spanking material’ was created in order to demean and humiliate their victims. There’s plenty of porn online and absolutely no need for them to do this. If you actually wanted to help the victims, you would not be trivialising and excusing this behaviour as ‘being horny about classmates’.

              • @[email protected]
                link
                fedilink
                English
                11 year ago

                And an AI image with a face photoshopped over it isn’t a photo of a child.

                And a teen being sexually interested in other teens isn’t a pedophile.