I don’t think that number means anything. The way it’s phrased, it’s 80% of some unknown percentage of 3000 users, so it could be that 5 people responded: one actual pedophile said ‘yes’, one actual pedophile said ‘no’, and the rest just said ‘yes’ to be edgy.
“In a dark web forum” - presumably not a forum about baking, manufacturing meth or pirating movies. In other words, they probably created the poll in a forum at least related to that kind of material/activity.
Two officials from the US Justice Department’s Child Exploitation and Obscenity Section told The Washington Post that AI-generated images depicting “minors engaged in sexually explicit conduct” are illegal under at least two US laws.
One law “makes it illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings that appear to depict minors engaged in sexually explicit conduct and are deemed obscene.” The other law “defines child pornography as any visual depiction of sexually explicit conduct involving a minor,” including “computer-generated images indistinguishable from an actual minor.”
Similar laws have been struck down by the Supreme Court in the past (notably Ashcroft v. Free Speech Coalition, 2002) under the argument that if no children are being harmed (i.e., these aren’t pictures of actual children), then there is no basis for the government to restrict creation and possession of the images.
Right. Except these models were trained on something.
That’s pretty fucking dumb considering it normalizes the idea of sexualizing children. Are policymakers really oblivious to how that will go?
Does video game violence normalize regular violence? Are people playing violent video games going out and harming people?
Can’t believe this argument is still being used in 2023.