One of the admins at lemmy.blahaj.zone asked us to purge a community and all of its users because they thought it was full of child sexual abuse material, aka CSAM, fka kiddy porn. We assured them that we had checked this comm thoroughly and we were satisfied that all of the models on it were of age.

The admin then demanded we purge the comm because it could be mistaken for CSAM, and claimed that the entire point of the community was to make people think it was CSAM. We vehemently disagreed that that was in fact the point of the community, but they decided to defederate from us anyway. That is of course their choice, but we will not purge our communities or users because someone else makes a mistake of fact, and then lays the responsibility for their mistake at our feet.

If someone made a community intended to fool people into thinking it was kiddy porn, that would be a real problem. If someone of age goes online and pretends – not roleplays, but pretends with intent to deceive – to be a child and makes porn, that is a real problem. Nobody here is doing that.

One of the reasons we run our instance the way that we do is that we want it to be inclusive. We don’t body shame, and we believe that all adults have a right to sexual expression. That means no adult on our instance is too thin, fat, bald, masculine, old, young, cis, gay, etc., to be sexy, and that includes adults that look younger than some people think they should. Everyone has a right to lust and to be lusted after. There’s no way to draw a line that says “you can’t like adult people that look like X” without crossing a line that we will not cross.

EDIT: OK, closing this post to new comments. Everything that needs saying has been said. Link to my convo with the blahaj admin here.

  • KairuByte (score 4, 1 year ago, edited)

    deleted by creator

    • @[email protected] (score 0, 1 year ago)

      I enjoy reading and commenting here, but this is a back-of-mind fear of mine for federated spaces like Lemmy.

      Bad actors could spam suspicious or actual CSAM.

      All it takes is one admin/host being “made an example of” to really shake the system.

      I hope I’m wrong and ignorant of the realities of the law and prosecution.

      • KairuByte (score 10, 1 year ago)

        Note: I deleted my comment by mistake. X.x

        So I think most of the time we would be in the clear, as long as actual CSAM is handled when it is found/reported.

        Just like Reddit doesn’t get hauled into court when CSAM is posted, and mods don’t get arrested for viewing it while they’re removing it.

    • @[email protected] (score -1, 1 year ago)

      I enjoy reading and commenting here, but this is a back-of-mind fear of mine for federated spaces like Lemmy.

      Bad actors could spam suspicious or actual CSAM.

      All it takes is one admin/host being “made an example of” to really shake the system.

      I hope I’m wrong and ignorant of the realities of the law and prosecution.

      • Mikey Mongol (OP, mod) (score 11, 1 year ago)

        We are extremely aware of this possibility and have taken many active steps against it, and we are scrupulously staying on the right side of US law when it comes to reporting potential CSAM. As stated in our FAQ, preventing CSAM on our instance is our highest priority.

        • @[email protected] (score 1, 1 year ago)

          Appreciate the verbiage, but I wasn’t calling Lemmy nsfw out; I was commenting on the whole big picture.