Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • Dr. Wesker · 8 points · 27 days ago

    The fuck? Nothing about generating and distributing CSAM is harmless, especially if images of real children are being used to generate it.

    • @Mango · 5 points · 27 days ago

      Okay. Who is harmed and how?

      • @[email protected] · 4 points · 26 days ago

        Would it harm you to have identifiable nude photos of you available for download on the internet?

        Would it harm you to have identifiable nude photos of you being used to train AI so that it can create more nude images that are “inspired” by your nude images?

        Would you be happy to upload your children’s nude photos so that people on the internet can share them and masturbate to them? Would you be harmed if your parents had done that with your images?

        • @Cryophilia · 2 points · 25 days ago

          Most AI generated images are not of real, identifiable people. I agree that deepfake porn is bad, whether of a child or adult, but that’s a separate category.

          • @[email protected] · 1 point · 25 days ago

            You’re definitely right, and I’m aware. The smaller the sample size, though, the more likely an AI art generator would create something that looks very similar to a given individual.

            As well, some AI art generators accept prompt images to use as a starting point.

            • @Cryophilia · 1 point · 25 days ago

              Ok but that’s a pretty niche thing to be worried about, is my point. You can’t apply that broadly to all AI porn.

        • @Mango · -1 points · 26 days ago

          As a child? No. In fact, I can milk that for pity money. As an adult, I can’t see how it matters. I don’t like it, but it doesn’t hurt me any.

          Also definitely no.

          Again, double no.

          • @[email protected] · 3 points · 26 days ago

            To clarify, the second-to-last question, about your children, was “would you be happy to …”

            If you wouldn’t be happy to, then why not?

            And if you would be happy to do that, then why? Lol

            • @Mango · 3 points · 26 days ago

              You got me there. It’s definitely weird and gross and therefore no. That’s harm enough, but that’s more a matter of it being published and real. This dude doing it for himself is hardly different to me from fantasizing in your head or drawing in your sketchbook. That said, what was his AI training material? He’s also doing this for other people and encouraging rape and shit.

              • @[email protected] · 3 points · 26 days ago

                What makes it different from imagining it or drawing it is that the AI is using real photos as training material. If the parents are knowingly providing the images, that’s questionable. If the AI is scraping up CSAM images as training data, that’s horrible. If it’s using non-CSAM images of children without the knowing consent of the parents, that’s pretty bad too.

                • @Mango · 1 point · 26 days ago

                  How is AI using real photos any different from a person using their real memory?

                  • @[email protected] · 2 points · 26 days ago

                    Because the AI publishes what it creates based on those images. The AI also doesn’t have imagination the way that a person does. It could accidentally create CSAM depicting a child who looks exactly like someone’s real child. And it can generate images that look like photos. Someone sketching something from memory can’t do that.