They expel high school students over that? I mean sure, it’s not good. But I believe that since this stuff is so new, teens don’t have a sense of right and wrong here yet. It’s also different from taking actual nude pictures of people. I would expect schools to give students a talk about this, and maybe educate everyone during sex education about this stuff.
“…educate everyone during sex education…”
In Florida?
This comment has a strong “boys will be boys” vibe. Creating and sharing such images can be completely devastating for the victims and change their lives for the worse.
Ok, let’s pump the brakes for a sec - I need to clarify something:
Is it your position that a high schooler using an AI deepfake generator to create a video of a female classmate in what could only be described as hardcore pornography is in any way, shape, or form excusable, defensible, or even slightly morally ambiguous?
Unequivocally no, but the punishment should fit the crime.
The question I have is whether they would have been arrested, expelled, or suspended if there was no AI involved. If they had drawn, painted, 3D rendered, or physically cut and pasted pictures, would the punishment have been as severe? At the very least I would think it’s still harassment and bullying; I’m just trying to figure out whether the punishments are getting harsher because the fakes are more realistic.
Or perhaps we’re only hearing about the AI stuff
Kid is lucky he’s not going to juvenile detention; making and sharing child porn is kind of an extremely serious crime, even if you’re in high school and the CP is AI-generated.
Calling out that I’ve read pedos on this site advocating for AI-generated content. This is problematic in a lot of ways. Normalizing AI CP, whether it’s of people you know or not, is nasty as fuck.
This high school stuff is the next level of problematic because it is based on real people in the local school system, and it is often weaponized.
No, it’s pretty clearly wrong.
Shut the fuck up. These are HS kids; they know better.
This asshole is the equivalent of the “well, what was she wearing?” guy every time Trump raped someone.
While I was initially inclined to agree with you on the argument of “where’s the law, where’s the line,” the article is pretty clear that there is a law for it where they live.
On the one hand, yeah, I generally agree that children shouldn’t be arrested for something they did just goofing around (to them it seems like a victimless crime), but on the other hand, it’s a sexual crime against children, which I firmly believe should be met with zero tolerance.
AI seems to be getting the same “what can you do about it” pass that guns get, and that attitude should be the focus.