Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in just to complain about how we are doing things, your post will be removed.
==
A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.
I approached the admins of lemmynsfw, and they assured me that the models featured in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear underage was fine with them. The fact that both I and a member of this instance assumed the content was CSAM was also fine with them. I was in fact told that I was body shaming.
I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.
Context always matters. I always check whether adult material has actually been made by consenting adults. I would feel sick if there weren't enough information to verify that, but fortunately I have never encountered CSAM.
I had classmates in high school with balding or even graying hair and full beards. Some adults older than me look younger than my nephews. Revenge porn and creepshots are common (or at least they were; I'm not on platforms where those are popular).
Without context, porn will always be a morally grey area. Even commercialized hyper-capitalist porn is still an intimate affair.
That's why, for example, I didn't use Pornhub before every user had to verify themselves in order to post. Until then, I only read erotica or looked at suggestive drawings.
I understand your perspective tho. You hardly get paid to keep this instance running, and looking at pictures that, without context, could be CSAM can make this volunteer work very mentally taxing. This is how NSFW works tho.
Without context, any pornographic material featuring real humans could in truth be some piece of evidence for a horrible crime.
If I can't tell, if I have to look something up because the people I'm looking at appear to be underage, then it doesn't matter what the answer is; the issue is that it looks like CSAM even if it's not. And a community designed in a way that attracts people looking for underage content is not a space I'm willing to federate with.
Removed by mod
I’ve covered this many times already.
The issue isn't individuals who happen to look younger than they are. The issue is a community gathering sexual content of people who appear to be children.
The community that initiated this isn’t even the worst offender on lemmynsfw. There is at least one other that is explicitly focused on this.
Removed by mod
I personally find the subs a little weird, but they have rules explicitly stating that no one is to be under 18, and as others have said, it's all clearly professionally taken, watermarked photos.
Also, if the person in the initial picture was a verified adult who posted it voluntarily, this all seems like a huge overreaction.
IMO it's pretty clear the lemmynsfw mods are doing a lot to remove any actual CP; they banned loli, shota, and other such content to avoid any legal or CP issues.