- cross-posted to:
- [email protected]
New Mexico is seeking an injunction to permanently block Snap from practices allegedly harming kids. That includes a halt on advertising Snapchat as “more private” or “less permanent” due to the alleged “core design problem” and “inherent danger” of Snap’s disappearing messages. The state’s complaint noted that the FBI has said that “Snapchat is the preferred app by criminals because its design features provide a false sense of security to the victim that their photos will disappear and not be screenshotted.”
I believe that cartoon images depicting sex involving underage kids are still illegal, at least in the US.
Feel free to correct me if I am wrong, but I seem to remember this from a news article a while back. Maybe it was just a specific state.
I am not going to Google that one to find out, though.
https://cellebrite.com/en/ai-and-csam-a-look-at-real-cases/
Best I could find about this.
Imo, as long as the AI was not trained on actual CSAM and the output does not depict real people, it shouldn't be illegal, since it is not hurting anyone, and protecting real kids is why we have laws against CSAM in the first place.
What if it normalises CSAM and some people don't discern between real and AI?
And what if video games, movies, and books normalize killing? There is no evidence to show that it does or that it will.
While I’m not going to have this specific topic in my search history, sexually violent porn very likely does nothing to encourage actual sexual violence. Most studies show that it has no effect on sexual violence at all, some show it decreases it, and only a few studies show it increases it (and those ones tend to have smarter people than me saying they are flawed).
While media can have psychological effects, normalizing extreme behavior doesn't seem to be one of them. That said, I wouldn't trust an AI bro or their AI to handle something like that. At best they don't know what goes into their training sets; at worst they would probably deliberately include CSAM.
We have porn games, but we don’t have CP games. There’s a line between violence and SA with minors.
Edit: oh wait, Japan might be an example 🙃 and yeah, they got issues.
I think the bar is whether it could be reasonably mistaken for a real child. Which makes quite a lot of disgusting content legal.
I also find it repugnant, but if the images are not based on real people and the AI was not trained on real CSAM (good luck proving this either way), then it shouldn't be illegal. The laws were made to protect kids, and drawings of purely fictional characters are not hurting the kids.
Pretty much every law ever made in the history of humanity that was ostensibly to protect children is actually about control of the population.
This is just plain wrong.
Obviously, there are loads of laws and very good legislation that does indeed protect children.
Just one example: child labour laws.
I suspect that what you really mean is that whenever a politician claims some police power is required to protect children, they really just want more power to violate privacy and make it easier to prosecute various crimes.
Exception that proves the rule.
What about child support paid by parents who are separated?
What about welfare laws ensuring a minimum standard of care for children?
What about social security for families?
What about minimum age of consent?
What about this one particular grain of sand I found that's blue? Look, here's another and another. Clearly, all sand is blue and beaches are blue. Don't argue, or I'll show you the six grains of sand I found.
What a silly thing to say.
You’ve made an assertion, I’ve provided examples to the contrary, and the best you’ve got is a grain of sand metaphor?
Obviously, it depends how many laws purported to protect children actually do. The examples I’ve provided form the bedrock of the modern family structure. They’re not insignificant grains of sand.
Yeah don’t Google it hahaha
What makes it a child? There are some creepy anime girls who definitely fall into that questionable category. And if I label a stick figure with an age… does that make it illegal? What about an AI image with bubble text that says "I'm not real. I'm 18, I have a magical curse on me, etc." Now it's fiction?
Since it isn’t actually real… what is the line, and how can that line be measured? Since this is just going to keep being a problem, this awkward conversation needs to happen in a logical, calm manner.
I definitely don’t want to sound as if I’m promoting this material, but I agree. Fake things are fake and real things are real. Yeah, it makes a lot of people uncomfortable to think about it and I totally understand.
Fake images of murder seem to be perfectly fine! And that’s arguably the worst crime possible. We show that shit to our kids.
Isn’t it sad that we even have to say we don’t promote it before we say anything else?
Yeah. But you know how it is here, you’re either against it or you’re one of them. Make a logical comparison between two nearly identical things and you’re whatabouting. I appreciate you recognizing the difference.
Probably not as bad on lemmy, at least.