I don’t know if you need this info, but I was pretty disturbed to unexpectedly come across child sexual abuse material in a casual community. Thankfully it didn’t happen on SLRPNK.net directly, but if anyone has advice beyond leaving the community in question, let me know. I also wanted to sound the alarm so we make sure measures are in place to guard against this.

  • Andy (OP) · 2 points · 10 months ago

    That’s pretty shocking.

    What tools are available to us to manage this?

    • poVoq (mod) · 13 points · 10 months ago

      The best tool currently available is lemmy-safety, an AI image scanner that can be configured to check images on upload, or to regularly scan the storage and remove likely CSAM images.

      It’s a bit tricky to set up, as it requires a GPU in the server and works best with object storage, but I have a plan to complete the setup of it for SLRPNK sometime this year.
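To make the "check on upload or regularly scan the storage" idea concrete, here is a minimal sketch of the periodic-scan mode. Everything in it is a placeholder (the object keys, the stub classifier, the 0.8 threshold); the real lemmy-safety tool runs its own GPU-backed model against the instance's object storage, not this API.

```python
# Rough sketch of a periodic "scan the storage" pass. The storage
# listing and the classifier are stand-ins, not lemmy-safety's API.

def likely_csam(image_bytes):
    """Placeholder classifier: a real deployment would run a
    GPU-backed model here and return a probability."""
    return 0.0  # stub: flags nothing

def scan_storage(objects, threshold=0.8, classify=likely_csam):
    """objects: mapping of object key -> image bytes.
    Returns the keys whose score crosses the removal threshold."""
    flagged = []
    for key, data in objects.items():
        if classify(data) >= threshold:
            flagged.append(key)
    return flagged

# Usage: two pretend stored images, scored by a fake classifier
# keyed on content so the demo is deterministic.
store = {"pictrs/a.webp": b"ok-bytes", "pictrs/b.webp": b"bad-bytes"}
fake_scores = {b"ok-bytes": 0.05, b"bad-bytes": 0.97}
print(scan_storage(store, classify=fake_scores.get))  # -> ['pictrs/b.webp']
```

A real run would loop this over pages of the object-storage listing and quarantine flagged keys for admin review rather than deleting outright.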

      • @[email protected] · 3 points · 10 months ago

        This is probably the best option; in a world where people use ML tools to generate CSAM, you can’t depend on visual hashes of known-problematic images anymore.
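To illustrate why hash matching falls short here: perceptual-hash schemes tolerate small edits to a known image, but by design they say nothing about an image that was never in the database. A toy average-hash ("aHash") sketch in plain Python, where the pixel lists are made-up stand-ins for tiny grayscale thumbnails:

```python
# Toy average-hash (aHash) comparison: a lightly edited copy of a
# known image still matches, but a freshly generated image does not.

def average_hash(pixels):
    """pixels: flat list of grayscale values (e.g. a small thumbnail).
    Returns one bit per pixel: 1 where the pixel is above the mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

known = [10, 200, 30, 180, 25, 190, 15, 170]   # "known" image
edited = [12, 198, 33, 179, 27, 188, 17, 168]  # lightly edited copy
novel = [90, 95, 100, 105, 110, 85, 120, 80]   # newly generated image

h_known = average_hash(known)
print(hamming(h_known, average_hash(edited)))  # -> 0, within threshold: match
print(hamming(h_known, average_hash(novel)))   # -> 6, far outside: missed
```

Production systems use far more robust hashes over larger thumbnails, but the failure mode is the same: a classifier that scores image content, as lemmy-safety does, is the only approach that can catch material with no prior database entry.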