Edit - This is a post to the meta group of Blåhaj Lemmy. It is not intended for the entire lemmyverse. If you are not on Blåhaj Lemmy and plan on dropping in to offer your opinion on how we are doing things in a way you don’t agree with, your post will be removed.

==

A user on our instance reported a post on lemmynsfw as CSAM. Upon seeing the post, I looked at the community it was part of, and immediately purged all traces of that community from our instance.

I approached the admins of lemmynsfw, and they assured me that the models featured in the community were all verified as being over 18. The fact that the community is explicitly focused on making the models appear as if they're under 18 was fine with them. The fact that both I and a member of this instance assumed it was CSAM was fine with them. I was in fact told that I was body shaming.

I’m sorry for the lack of warning, but a community skirting the line trying to look like CSAM isn’t a line I’m willing to walk. I have defederated lemmynsfw and won’t be reinstating it whilst that community is active.

  • @Alpharius · 16 points · 1 year ago

    I get the feeling it's harder to moderate NSFW content posted in real time across multiple instances and even more communities. Anyone could poison the well with heinous content, and it would take a moderator of that specific instance/community to remove it, rather than having centralized moderators for illegal/deplorable content.

    • @[email protected] · 20 points · 1 year ago

      I’m just imagining the liability of NSFW content. Honestly, I think it's an excellent idea that ada defederated; I don't think they'd want the legal risk. So many laws can be broken just through neglect: revenge porn laws, depictions of actual SA, underage content slipping through, etc.

      • @EatMyDick · 4 points · 1 year ago

        I was laughed out of the room for pointing this out the other day, and I still get laughed out of the room pointing out the massive privacy concerns and liability everyone is setting themselves up for.

        Someone is going to get sued over children's data under the GDPR.