- cross-posted to:
- [email protected]
A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and growing ubiquity of generative AI being used for nefarious purposes.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, out of the theater in handcuffs.
It’s not CSAM in the training dataset, it’s just pictures of children/people that are already publicly available. That puts this on the copyright side of the AI debate rather than the illegal-training-material side.
It’s images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children’s consent was not obtained.
Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?
Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.
Every time with you people: you can’t have a discussion without accusing someone of being a pedo. If that’s your go-to, it says a lot about how weak your argument is, or what your motivations are.
It’s hard to believe someone is not a pedo when they advocate so strongly for child porn
You’re just projecting your unwillingness to ever take a stance that doesn’t personally benefit you.
Some people can think about things objectively and draw a conclusion that makes sense to them without personal benefit being a primary determinant of said conclusion.
I’m not the one here defending child porn
You’re arguing against a victimless outlet that significant evidence suggests would reduce the incidence of actual child molestation.
So let’s use your ‘logic’/argumentation: why are you against reducing child molestation? Why are you against fake pictures but not actual child molestation? Why do you want children to be molested?
Your claim that it’s victimless is, of course, false, since real children are used in the training data without consent. This also ignores the fact that the result is child porn, which you are arguing in support of.
Lastly, your claim that any of this results in any reduction in child abuse is spurious and unsubstantiated.
That’s your assumption, but there are a ton of royalty-free images containing children out there, more than enough for an AI to ‘learn’ proportions etc. Combine that with adult nudity, and a generative AI can ‘bridge the gap’ and create images of people that don’t exist (hence the word “generative”).
That’s not a fact. “Child porn” requires a child. Pixels on a screen depicting the likeness of a person, and a person who does not actually exist in the real world to boot, are not a child.
I’m just making a reasonable guess based on what’s been found about other things in the same subcategory (Japanese research found that those who have actually molested a kid were less likely to have consumed porn comics depicting that subject matter than the general population), and in other sex categories, like how the prevalence of rape-fantasy porn online correlates with a massive reduction in real-life rape.
Seems pretty unlikely that this is going to be the one and only exception to date where a fictional facsimile doesn’t ‘satiate’ the urge to offend in real life, and instead encourages the ‘consumer’ to offend.
It’s hard to argue with someone who believes the use of legal data to create more data can ever be illegal.
Child porn isn’t legal
Lol, you don’t understand that the AI-generated faces are not real. In any way.
I am not trying to rationalize it, I literally just said I was neutral.
How are you neutral about child porn? The vast majority of everyone on this planet is very much against it.
I’m not neutral about child porn, I’m very much against it, so stop trying to put words in my mouth. I’m saying this kind of AI use could fall into the very same category as loli imagery, since it is not real child sexual abuse material.
Then why are you defending it?
And I do not believe there is a context where CSAM is okay, I never said that. And me being neutral on this topic doesn’t make me sick and perverted.
It’s hard to believe you’re not a pedophile when you advocate so strongly for child porn.