The CSAM spamming happened again, and unfortunately a local user had to witness it and report it.

I myself did not see it, but from what I gather it was a TikTok video that looked innocent at first glance and then transitioned into CSAM. Utterly vile.

The content is being reported and admins across lemmy are handling it. I am hopeful this is an isolated incident, but what the fuck, man. What do we do from here?

  • Lung
    -3 points · 2 years ago

    Just wait until the next patch and disable image uploads. Eventually, people will catch on that it’s too dangerous to leave image uploads enabled.

    • gabe [he/him] (OP, mod)
      7 points · 2 years ago

      There’s a lot of FUD (fear, uncertainty & doubt) being spread on lemmy about this entire thing that isn’t helping anyone, and it’s becoming extremely annoying. The way lemmy currently handles image uploads leaves a lot to be desired, but allowing image uploads (when safety rails are in place) isn’t as dangerous as people are making it out to be.

      I’m more than a bit sick of people fearmongering that the FBI is going to raid your instance over content you had no idea about. If you live in the US and host in the US, there are safe harbor protections, and so long as you are doing your due diligence, reporting, etc., you’ll be fine. Almost all of the legally problematic images being posted come from external image hosts; the only reason it’s a major issue is that lemmy currently caches ALL images, including external ones.
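      As a stopgap until that changes, an admin can refuse new image uploads at the reverse proxy rather than waiting on lemmy itself. A minimal sketch, assuming nginx sits in front of lemmy, uploads go through the /pictrs/image path, and the backend listens at http://lemmy:8536 (the path and upstream name are assumptions, so check them against your own deployment):

        # Sketch only: reject new image uploads at the proxy while keeping
        # already-hosted images readable. Path and upstream are assumptions.
        location /pictrs/image {
            # Read methods stay allowed so existing images keep loading.
            limit_except GET HEAD {
                deny all;  # POST/PUT uploads are rejected with 403
            }
            proxy_pass http://lemmy:8536;
            proxy_set_header Host $host;
        }

      This only stops new local uploads, though; it does nothing about the cached copies of external images, which is the part that has to change in lemmy itself.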

      • Lung
        0 points · 1 year ago

        Thanks for defining FUD; it also rhymes with mud. Yeah, for sure, if you want to manage a team to handle takedown requests and reply to legal threats from corporations, then go ahead and keep image uploading on.

        Oh, the devs are already making a feature to disable this? Shit, well, I guess they fell for the FUD.