I got baited into posting a picture of a child eating popcorn on Discord, not knowing it was associated with CSAM. The account got banned; I don’t care about the account so much as about the legal consequences. Has anyone heard of legal action against people posting it?

    • @SpatchyIsOnline
      6 • 10 hours ago

      From what other people have said, and from the occasional video that’s popped up on YouTube, Discord keeps a database of known CSAM that its automated systems match uploads against, and certain individuals try to bait people into posting seemingly innocent pictures that are actually frames from that material. Discord’s systems recognize the image as a frame from such a video and auto-ban the account.
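
      To make the matching step concrete, here is a minimal sketch of perceptual-hash lookup, the general technique behind tools like PhotoDNA: each image is reduced to a compact fingerprint and compared against a set of fingerprints of known material, so no banned imagery needs to be opened at match time. The `imagehash` average hash stands in for PhotoDNA’s proprietary algorithm, and the stored hash value and distance threshold are invented for illustration.

      ```python
      # Sketch of perceptual-hash matching. imagehash's average hash is a
      # stand-in; PhotoDNA's real algorithm is proprietary and not shown here.
      from PIL import Image
      import imagehash

      # Hypothetical set of fingerprints of known banned images. Real systems
      # store hashes supplied by clearinghouses, never the images themselves.
      BANNED_HASHES = {
          imagehash.hex_to_hash("ffd8e0c0a0b0c0d0"),  # invented example value
      }

      MAX_DISTANCE = 5  # Hamming-distance threshold; small edits still match

      def is_banned(path: str) -> bool:
          """Hash an uploaded image and compare it to the banned-hash set."""
          candidate = imagehash.average_hash(Image.open(path))
          # imagehash overloads `-` to return the Hamming distance (number of
          # differing bits) between two hashes.
          return any(candidate - banned <= MAX_DISTANCE for banned in BANNED_HASHES)

      if __name__ == "__main__":
          if is_banned("upload.png"):  # hypothetical uploaded file
              print("Match found: flag account for review / auto-ban")
      ```

      Because the comparison is a distance threshold rather than an exact byte match, light re-encoding or cropping of a known frame can still trigger a hit, which is consistent with an innocent-looking screenshot being flagged.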

      • @[email protected]
        4 • 8 hours ago

        This is fascinating and I have a bunch of questions, basically all centered around the fact that possession of such content is outlawed. I don’t expect OP to know, but maybe someone else does:

        Isn’t it illegal to have a library of such content? Is there a legal carve-out for that, like Coca-Cola importing cocaine?

        How is the library compiled, maintained, and added to?

        Is the library specific to Discord, or is it a shared library maintained by some centralized “authority” or developer? If it’s specific to Discord, can we assume there are many different libraries of illegally produced and possessed content compiled and maintained by various social media companies? Who’s got that job? Do they get therapy in their benefits package?

        • @Here_for_the_dudes (OP)
          2 • 8 hours ago

          As far as I understand, they use a tool called PhotoDNA (an image-hashing technology developed by Microsoft and licensed to platforms; the AI company Discord acquired was Sentropy) to scan pictures.
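
          On the earlier question of who compiles the library: as I understand it, platforms generally don’t curate the raw material themselves but consume hash lists distributed by clearinghouses such as NCMEC. A rough sketch of how such a list might be loaded and queried follows; the file name, CSV format, and column name are invented for illustration, as real hash-sharing programs define their own formats and access controls.

          ```python
          # Sketch: consuming a centrally maintained hash list. File name and
          # CSV layout are hypothetical, not any real program's format.
          import csv

          def load_hash_list(path: str) -> set[str]:
              """Read distributed hashes into a set for O(1) exact lookup."""
              with open(path, newline="") as f:
                  return {row["hash"] for row in csv.DictReader(f)}

          def check_upload(image_hash: str, known_hashes: set[str]) -> bool:
              # Exact match shown here; perceptual systems like PhotoDNA
              # instead compare within a distance threshold, as in the
              # sketch earlier in the thread.
              return image_hash in known_hashes

          known = load_hash_list("hash_list.csv")  # hypothetical file
          if check_upload("ffd8e0c0a0b0c0d0", known):  # invented fingerprint
              print("Known image: escalate for review and reporting")
          ```

          This design is also a plausible answer to the possession question above: the platform only ever holds fingerprints, not the images they were derived from.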