- cross-posted to:
- [email protected]
Hugh Nelson, 27, from Bolton, jailed after transforming normal pictures of children into sexual abuse imagery
A man who used AI to create child abuse images using photographs of real children has been sentenced to 18 years in prison.
In the first prosecution of its kind in the UK, Hugh Nelson, 27, from Bolton, was convicted of 16 child sexual abuse offences in August, after an investigation by Greater Manchester police (GMP).
Nelson had used Daz 3D, a computer programme with an AI function, to transform “normal” images of children into sexual abuse imagery, Greater Manchester police said. In some cases, paedophiles had commissioned the images, supplying photographs of children with whom they had contact in real life.
He was also found guilty of encouraging other offenders to commit rape.
Sorry, I meant it could create CSAM that, by accident, looks exactly like one of the source children.
AI “publishes” whenever it gives something to the user.
Drawing is different from AI art because AI art can look like photographs.
Drawing can look like photographs. How old are you?
Lol…do you really not see the difference between an AI art generator that can produce realistic CSAM in seconds, and a talented artist who can draw CSAM so realistic that it looks like a photograph?
No, I don’t. There’s no difference. Are you trying to say that talent earns you a free pass that others shouldn’t get? Fuck that. The speed is meaningless. The realism is meaningless. The brush you paint with doesn’t change the ethics even a little bit.
It’s not about the speed in isolation. The speed is what allows for the quantity to be much greater.
Just like breaking into one car overnight is bad, but breaking into 100,000 cars in one night is a problem of a much greater scope.
So your point is that because he’s fast with this tool, it’s bad? Guess we gotta institute fake CP data rate limits.
A tool that allows anyone to generate countless images of CSAM in minutes (based on real images as input) is definitely worse than someone needing to spend years honing a craft and hours producing a single image of CSAM. I’m not really sure how someone could argue against that.
Why? It’s pictures. Sticks and stones yo.
So again…you wouldn’t do it with your children’s pictures, right?