I’m sure I’m not the first to think about this, but I haven’t seen it mentioned yet. I believe there is another problem with open registration instances besides just the threat of spamming.

If you go through the process of making a new post in ANY community, attach an image, and then cancel the post, the image is STILL on the server at the generated URL and is publicly viewable by anyone who has that URL.

Theoretically, someone could upload illegal images this way and hotlink to them from another site. Because no post on the instance has the bad image attached, an instance admin would have no way of knowing the images were there unless they make a habit of browsing the pict-rs datastore regularly. There’s currently no easy way to moderate or delete images in the pict-rs datastore.
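
To illustrate, here’s roughly what the upload step looks like in isolation. The endpoint path and response fields below are just what the Lemmy web UI appears to call, so treat them as assumptions and check against your own instance; lemmy.example is a placeholder.

```python
# Upload an image through Lemmy's pict-rs proxy and never create a post.
# Endpoint path and response shape are assumptions based on what the web
# UI appears to use; verify against your own instance before relying on it.
import requests

INSTANCE = "https://lemmy.example"   # placeholder instance
JWT = "your-login-jwt-here"          # placeholder; obtained via /api/v3/user/login

with open("test.png", "rb") as f:
    resp = requests.post(
        f"{INSTANCE}/pictrs/image",
        files={"images[]": f},
        cookies={"jwt": JWT},
        timeout=30,
    )
resp.raise_for_status()
uploaded = resp.json()["files"][0]

# No post ever references this file, yet it is now publicly reachable:
print("orphaned image:", f"{INSTANCE}/pictrs/image/{uploaded['file']}")
```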

I don’t think I need to elaborate on what kinds of images could be lurking on your very own server, hotlinked into VERY dark places on the web. Saying that you “didn’t know” they were there is not a defense. When the authorities come knocking on your door because you are hosting illegal images, you will be sorry you didn’t take a more active role in managing your user base.

I realize that even if you close registration or require manual approval there’s still the danger of a bad user doing this very thing, but I think putting in some minor hurdles would greatly decrease the chances.

Regardless, I think there needs to be a better way to manage the pict-rs part of Lemmy, and an easy way for admins and instance owners to view EVERY attachment on their server to make sure there’s nothing there that could get them in trouble.
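
As a stopgap, even just being able to list what pict-rs has stored would help. Here’s a minimal sketch assuming the default filesystem backend; the path is whatever your pict-rs volume maps to, so adjust it.

```python
# Walk the pict-rs file store and print every stored object with its size,
# so an admin can at least see what is sitting on disk.
import os

PICTRS_DIR = "/var/lib/pictrs/files"  # assumption: point at your pict-rs volume

for root, _dirs, files in os.walk(PICTRS_DIR):
    for name in files:
        path = os.path.join(root, name)
        size_kb = os.path.getsize(path) // 1024
        print(f"{size_kb:>8} KB  {path}")
```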

I run a small instance that only has people I personally know on it, so it’s not a worry for me. But the larger instances that are opening signups to strangers should be aware and take precautions.

  • @entropicshart, 18 points, 1 year ago

    Honestly, this sounds like a bug that needs to be fixed so that images aren’t uploaded before the post is committed.

    Furthermore, adding the ability to scan uploaded content for inappropriate images could help auto-remove anything that makes it past moderation, or even catch it before moderation.
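
    Something along these lines is what I’m picturing; classify_nsfw is a stand-in for whatever detection model or library gets chosen, not a real API.

    ```python
    # Sketch of a "scan on upload" hook. classify_nsfw is a placeholder for
    # whatever NSFW-detection model/library is chosen; it is not a real API.
    from typing import Callable

    def should_quarantine(image_path: str,
                          classify_nsfw: Callable[[str], float],
                          threshold: float = 0.9) -> bool:
        """True if the classifier is confident enough that the image is unsafe."""
        # Assumed convention: 0.0 = safe, 1.0 = definitely inappropriate.
        return classify_nsfw(image_path) >= threshold

    # The upload handler could then hold the file back for review instead of
    # serving it, e.g.:
    #   if should_quarantine(tmp_path, classify_nsfw=model.predict):
    #       move_to_review_queue(tmp_path)   # hypothetical helper
    ```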

    • @[email protected], 1 point, 1 year ago

      Yeah, definitely sounds like a bug. The fix doesn’t even have to prevent image uploads before the post is committed; it just needs a way to periodically cull images that don’t have any posts or comments referencing them. This seems like necessary functionality regardless, both to keep server storage requirements from ballooning out of control and for CCPA and GDPR purposes.
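
      Roughly what I have in mind is below. The table and column names are my guesses at the Lemmy schema, a complete version would also parse image links out of comment markdown, and pict-rs keeps its own alias-to-file mapping, so the actual purge step should go through pict-rs itself rather than a naive directory walk.

      ```python
      # Build the set of pict-rs aliases Lemmy's database still references;
      # anything pict-rs stores that is NOT in this set is a cull candidate.
      # Table/column names are assumptions about the Lemmy schema; verify
      # them, and only report candidates until the query is trusted.
      import psycopg2

      DSN = "dbname=lemmy user=lemmy host=localhost"  # placeholder connection string

      QUERY = """
          SELECT url FROM post WHERE url LIKE '%/pictrs/image/%'
          UNION SELECT thumbnail_url FROM post WHERE thumbnail_url IS NOT NULL
          UNION SELECT avatar FROM person WHERE avatar IS NOT NULL
          UNION SELECT banner FROM person WHERE banner IS NOT NULL
          UNION SELECT icon FROM community WHERE icon IS NOT NULL
          UNION SELECT banner FROM community WHERE banner IS NOT NULL
      """

      def referenced_aliases() -> set:
          """Aliases (last URL path segment) that something in Lemmy still points at."""
          with psycopg2.connect(DSN) as conn, conn.cursor() as cur:
              cur.execute(QUERY)
              return {row[0].rsplit("/", 1)[-1] for row in cur if row[0]}

      if __name__ == "__main__":
          keep = referenced_aliases()
          print(f"{len(keep)} referenced images; anything pict-rs holds beyond these is a cull candidate")
      ```

      Run it read-only for a while first; actual deletion should only happen once you trust the referenced set.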

    • AtomHeartFatherOP, 9 points, 1 year ago

      Seems like the best thing to do would be to run that on a daily schedule (a rough sketch of that is below), and ideally also expose it in the UI. I worry about those admins who just “followed the recipe” to get a Lemmy instance up and running but lack any real sysadmin ability.

      I think there’s probably a big overlap between the novice admins and the instances whose admins are unaware they’re getting flooded with bot registrations.
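
      For that novice-admin case, even something as blunt as this daily loop would do the job, though a cron job or systemd timer is the proper way; cull_orphans.py is a hypothetical name for the script sketched above.

      ```python
      # Trivial daily loop around an orphan-cull script, for admins who
      # haven't set up cron. cull_orphans.py is a hypothetical script name.
      import subprocess
      import time

      while True:
          subprocess.run(["python3", "cull_orphans.py"], check=False)
          time.sleep(24 * 60 * 60)  # once a day
      ```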

      • dekatron, 5 points, 1 year ago

        I agree. There has been a lot of activity and interest from new contributors on the Lemmy GitHub lately. Hopefully these issues will get sorted out as Lemmy grows.

  • @nivenkos, 13 points, 1 year ago

    How does closed registration stop this? How do you magically detect trolls (or even bots)?

    The only fix is giving admins and moderators the ability to permanently delete content, and that shouldn’t be a problem.

    At the end of the day the admins can just SSH into the server and delete it. It’s not on the blockchain.

    • @Aztech, 4 points, 1 year ago

      Based on what you said, I searched for some tools and found this topic on GitHub: nsfw-detection. It could be a first line of defense.

      • tubbadu, 1 point, 1 year ago

        deleted by creator

  • @[email protected], 8 points, 1 year ago

    It’s not as big of a legal issue as you may think, as long as you respond promptly to requests to delete such material and can show you didn’t encourage such behavior.

    But definitely be aware of it.

  • tubbadu, 3 points, 1 year ago

    A possible solution might be to link the image to the post that uploaded it, so that if the post gets deleted the image is deleted as well, or “hidden” somewhere if the post needs to remain un-deletable. That way, removing the post wouldn’t let people still view the image, which would make this “trick” useless.
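
    As a rough sketch of that idea, assuming the application kept the pict-rs delete token alongside the post when the image was uploaded; the delete-URL shape is my assumption based on pict-rs’s delete-token links, so verify it against your version.

    ```python
    # Hook to call from the post-deletion path: remove the attached image
    # via its pict-rs delete token. Assumes the delete_token was stored
    # with the post; the URL shape is an assumption, check your version.
    import requests

    INSTANCE = "https://lemmy.example"  # placeholder instance

    def on_post_deleted(image_alias: str, delete_token: str) -> None:
        url = f"{INSTANCE}/pictrs/image/delete/{delete_token}/{image_alias}"
        requests.get(url, timeout=30).raise_for_status()
    ```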

  • Maiznieks, 1 point, 1 year ago

    Needs an app that can identify hotdogs