I’m sure I’m not the first to think of this, but I haven’t seen it mentioned yet. I believe open registration instances face another problem besides the threat of spam.

If you go through the process of making a new post in ANY community, attach an image, and then cancel the post, the image is STILL on the server at the given URL and is publicly viewable by anyone who has that URL.

Theoretically, someone could upload illegal images this way and hotlink to them from another site. Because no post on the instance has the bad image attached, an instance admin would have no way of knowing the images were there unless they made a habit of browsing the pict-rs datastore regularly. And there’s currently no easy way to moderate or delete images in the pict-rs datastore.

I don’t think I need to elaborate on what kinds of images could be lurking on your very own server, hotlinked into VERY dark places on the web. Saying that you “didn’t know” they were there is not a defense. When the authorities come knocking on your door because you are hosting illegal images, you will be sorry you didn’t take a more active role in moderating your user base.

I realize that even with closed or manual registration there’s still the danger of a bad user doing this very thing, but I think putting up some minor hurdles would greatly decrease the chances.

Regardless, I think there needs to be a better way to manage the pict-rs side of Lemmy, and an easy way for admins and instance owners to view EVERY attachment on their server to make sure there’s nothing there that could get them in trouble.
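As a stopgap, an admin could at least enumerate everything in the store themselves. Here’s a rough sketch for instances using pict-rs’s filesystem backend; note that the store path used in the example (`/var/lib/pict-rs/files`) is an assumption and will vary by deployment:

```python
import os
import time

def list_store_files(store_root):
    """Walk a pict-rs filesystem store and return (path, size, mtime)
    tuples, newest first, so an admin can eyeball recent uploads."""
    entries = []
    for dirpath, _dirnames, filenames in os.walk(store_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            entries.append((path, stat.st_size, stat.st_mtime))
    entries.sort(key=lambda e: e[2], reverse=True)
    return entries

if __name__ == "__main__":
    # Assumed default location; check your own pict-rs configuration.
    for path, size, mtime in list_store_files("/var/lib/pict-rs/files"):
        print(f"{time.ctime(mtime)}  {size:>10}  {path}")
```

Sorting newest-first makes it easy to spot uploads that appeared without any corresponding post.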

I run a small instance that just has people I personally know on it, so it’s not a worry for me. But the larger instances that are opening signups to strangers should be aware and take precautions.

  • @nivenkos · 13 points · 11 months ago

How does closed registration stop this? How do you magically detect trolls (or even bots)?

The only fix is the ability for admins and moderators to permanently delete content, and that shouldn’t be a problem.

    At the end of the day the admins can just SSH into the server and delete it. It’s not on the blockchain.
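    Deleting by hand over SSH works, but a small guard against path mistakes doesn’t hurt. A sketch (the store root here is an assumption, not a pict-rs default):

```python
import os

def delete_stored_image(store_root, relative_path):
    """Remove one file from the pict-rs store, refusing any path
    that escapes the store root (e.g. via '..')."""
    target = os.path.realpath(os.path.join(store_root, relative_path))
    root = os.path.realpath(store_root)
    if not target.startswith(root + os.sep):
        raise ValueError(f"refusing to delete outside the store: {target}")
    os.remove(target)
```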

    • @Aztech · 4 points · 11 months ago

Based on what you said, I searched for some tools and found this topic on GitHub: nsfw-detection. It could be a first line of defense.
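      A scan over the store with a pluggable classifier could look like the sketch below. `classify` is a stand-in for whatever nsfw-detection model gets wired in (not a real API), and the threshold is an arbitrary example:

```python
import os

def flag_suspect_images(store_root, classify, threshold=0.8):
    """Run a caller-supplied classifier (path -> score in [0, 1]) over
    every file in the store and return paths at or above the threshold."""
    flagged = []
    for dirpath, _dirnames, filenames in os.walk(store_root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if classify(path) >= threshold:
                flagged.append(path)
    return flagged
```

Anything flagged would still need human review before deletion; automated detectors are only a first pass.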

      • tubbadu · 1 point · 11 months ago

        deleted by creator