Lemmy.world is temporarily disabling open signups and moving to an application-required signup process, due to ongoing issues with malicious bot accounts.
We know this is a major step to take, but we believe that it’s the right one for both us and our community right now.
We’re working on a better long-term technical solution to these bots, but it will take time to build, test, and verify that it doesn’t cause problems with federation or with how our users use the site. We’d rather make sure we get it right than end up with a site that’s broken.
We’re making this change on 28 Aug 2023. We don’t have a specific timeline for how long registrations will require an application, but we will post an update once our new anti-abuse measures are in place and working.
Take care,
LW Team
Removed by mod
I’m guessing they’re not even flagging that shit as NSFW? I’ve been using Liftoff and have the NSFW stuff hidden. I haven’t run into any of it yet, but that’s fucked up; hopefully this gets it under control.
Maybe mods of each section could turn on manual approval of submissions?
Removed by mod
Isn’t there a tool (possibly free) from Google, I think, that detects abusive material like this?
https://protectingchildren.google/intl/en_uk/#introduction
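The tooling described on that page (Google’s Content Safety API and CSAI Match) isn’t an open drop-in library as far as I know; platforms have to request access, so the exact API isn’t shown here. As a rough illustration of how this class of detection generally works, here’s a minimal hash-matching sketch in Python using the open-source imagehash library: an uploaded image’s perceptual hash is compared against a list of known-bad hashes. The file name "known_bad_hashes.txt" and the distance threshold are hypothetical placeholders; a real deployment would use a vetted hash database from a trusted clearinghouse, not a local text file.

```python
# Illustrative sketch only: not Google's actual Content Safety API.
# It shows the general hash-matching idea behind such tools: compute a
# perceptual hash of an uploaded image and compare it against known-bad hashes.
# HASH_LIST_PATH and MATCH_THRESHOLD are hypothetical placeholders.

from PIL import Image
import imagehash

HASH_LIST_PATH = "known_bad_hashes.txt"  # hypothetical list, one hex hash per line
MATCH_THRESHOLD = 5                      # max Hamming distance still counted as a match

def load_known_hashes(path: str) -> list[imagehash.ImageHash]:
    """Read hex-encoded perceptual hashes, one per line."""
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]

def is_flagged(image_path: str, known_hashes: list[imagehash.ImageHash]) -> bool:
    """True if the image's perceptual hash is close to any known-bad hash."""
    candidate = imagehash.phash(Image.open(image_path))
    # Subtracting two ImageHash objects gives their Hamming distance.
    return any(candidate - known <= MATCH_THRESHOLD for known in known_hashes)

if __name__ == "__main__":
    known = load_known_hashes(HASH_LIST_PATH)
    print(is_flagged("upload.jpg", known))
```

Perceptual hashes are meant to survive small edits (resizing, re-encoding), which is why this approach catches re-uploads of known material; the real services reportedly also run classifiers to flag previously unseen content, which a simple hash list can’t do.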
Removed by mod
deleted by creator
I agree. Everything on Lemmy is public for all to see; that’s the nature of the Fediverse. Nothing here is really private, not even vote counts: admins of any self-hosted server can see them, and Kbin reveals them publicly for all.
Even DMs aren’t private, which is why the site nags you to use Matrix for secure DMs.
To combat this until there’s something in place to automate blocking it, manual approval might just be the only way to deal with it for now. Communities can add more moderators.
Manual approval would mean that mods have to see all that shit to block it… That’s not the right solution imo
They’ll end up having to see it anyways to remove it, and by that point more than just the mods would have seen it…