Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery

A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.

In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).

Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.

He was also found guilty of encouraging other offenders to commit rape.

  • @Mango · 26 days ago

    As a child? No. In fact, I can milk that for pity money. As an adult, I can’t see how it matters. I don’t like it, but it doesn’t hurt me any.

    Also definitely no.

    Again, double no.

    • @[email protected] · 26 days ago

      To clarify, the second-to-last question about your children was “would you be happy to …”

      If you wouldn’t be happy to, then why not?

      And if you would be happy to do that, then why? Lol

      • @Mango · 26 days ago

        You got me there. It’s definitely weird and gross, and therefore no. That’s harm enough, but that’s more a matter of it being published and real. This dude doing it for himself is, to me, hardly different from fantasizing in your head or drawing in your sketchbook. That said, what was his AI training material? He’s also doing this for other people and encouraging rape and shit.

        • @[email protected] · 26 days ago

          What makes it different from imagining it or drawing it is that the AI is using real photos as training material. If the parents are knowingly providing images, that’s questionable. If the AI is trained on CSAM images, that’s horrible. If it’s using non-CSAM images of children without the knowing consent of the parents, that’s pretty bad too.

          • @Mango · 26 days ago

            How is AI using real photos any different from a person using their real memory?

            • @[email protected] · 26 days ago

              Because the AI publishes what it creates based on those images. The AI also doesn’t have imagination the way that a person does. It could accidentally create CSAM material with a child that looks exactly like someone’s child. And it can generate images that look like photos. Someone sketching something from memory can’t do that.

              • @Mango · 26 days ago

                AI doesn’t have to publish, and that also doesn’t make it any different from drawing. I don’t think the CP is accidental. Someone with enough skill can absolutely do that.

                • @[email protected] · 25 days ago

                  Sorry, I meant it could create CSAM that, by accident, looks exactly like one of the source children.

                  AI “publishes” whenever it gives something to the user.

                  Drawing is different from AI art because AI art can look like photographs.

                  • @Mango · 25 days ago

                    Drawing can look like photographs. How old are you?