Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do to stop it: since we changed our registration policy, they just post from other instances instead.

We keep working on a solution. We have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @[email protected], the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. If it hadn’t been his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: Removed the bit about the moderator tools. It came out harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we’ve felt helpless. Anyway, I hope we can announce something more positive soon.

  • Cosmic Cleric
    1 year ago

    You misunderstood what I meant by the part that you highlighted of my comment.

    I’m speaking of Safe Harbor provisions, not having to take active DMCA actions. They’re two very different things.

    • 𝕯𝖎𝖕𝖘𝖍𝖎𝖙
      1 year ago

      Yes, and believe it or not, I’ve been discussing both with people.

      I use DMCA actions as an example because they are easily understood. People get copyright strikes. People pirate music.

      Safe Harbor provisions are not as easily understood, but they basically amount to (IANAL) “if the administrator removes the offending content in a reasonable amount of time after learning about it, then we’re all good.” It’s not a safe haven for illicit content; it’s more of a “well, you didn’t know, so we can’t really fault you for it” sort of deal. But once admins know about the content, they need to take action.