• @gAlienLifeform
    34 • 1 year ago • edited

    Those less responsible authors should be shown this study from the same organization last month showing similar problems on Twitter

    In the course of the investigation, researchers found that despite the availability of image hashes to identify and remove known CSAM, Twitter experienced an apparent regression in its mitigation of the problem. Using PhotoDNA, a common detection system for identified instances of known CSAM, matches were identified on public profiles, bypassing safeguards that should have been in place to prevent the spread of such content. This gap was disclosed to Twitter’s Trust & Safety team which responded to address the issue. However, the failure highlights the need for platforms to prioritize user safety and the importance of collaborative research efforts to mitigate and proactively counter online child abuse and exploitation.

    That being said, people who code for the Fediverse should see this report and pay particular attention to things like

    Current tools for addressing child sexual exploitation and abuse online—such as PhotoDNA and mechanisms for detecting abusive accounts or recidivism—were developed for centrally managed services and must be adapted for the unique architecture of the Fediverse and similar decentralized social media projects.

    I honestly don’t know crap about coding, but this seems like a very solvable problem and something I’d very much like for the people who do to engage with. I would absolutely donate some money to support a project like this.

    e; I guess what I meant to say is I would absolutely donate some money to purchase API keys from Microsoft

    • MeowdyPardner
      8 • 1 year ago

      I actually just saw that Dansup is working on adding optional, opt-in support for PhotoDNA in Pixelfed, enabled when an instance admin adds a PhotoDNA API key. I wonder if that was spurred on by this report. Hopefully Mastodon also looks into adding support.

        • @gAlienLifeform
          5 • 1 year ago

        Nice, yeah, hopefully this feature or something that accomplishes the same thing spreads* throughout the Fediverse quickly

        *Like, it would be really cool if there was a way to fight child porn that didn’t involve relying on a for profit company, but chipping away at our screwed up economic system is a lower priority than stopping child abuse

    • fmstrat
      4 • 1 year ago

      After a bit of reading, another option may simply be to include a “report” button that generates a hash of the image and federates the list. That being said, there may be a similarity algorithm under the hood of PhotoDNA that works better. Hard to say, since it’s all proprietary and pay-for-membership. Prices aren’t even listed publicly unless you use a cloud API.
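      The report-button idea above could be sketched roughly like this. Everything here is hypothetical (the class and method names are made up for illustration), and SHA-256 is only a stand-in: it matches byte-identical files, whereas PhotoDNA uses a perceptual hash that survives resizing and re-encoding.

      ```python
      import hashlib


      class FederatedBlocklist:
          """Hypothetical per-instance hash list that can be shared between servers."""

          def __init__(self):
              self.hashes = set()

          def report(self, image_bytes: bytes) -> str:
              """Hash a reported image and add it to this instance's list."""
              digest = hashlib.sha256(image_bytes).hexdigest()
              self.hashes.add(digest)
              return digest

          def merge(self, remote_hashes) -> None:
              """'Federate': fold in hashes received from another instance."""
              self.hashes.update(remote_hashes)

          def is_blocked(self, image_bytes: bytes) -> bool:
              """Check an incoming upload against the combined list."""
              return hashlib.sha256(image_bytes).hexdigest() in self.hashes


      # Instance A's users report an image; instance B syncs and then blocks it.
      instance_a = FederatedBlocklist()
      instance_b = FederatedBlocklist()
      instance_a.report(b"fake image bytes")
      instance_b.merge(instance_a.hashes)
      print(instance_b.is_blocked(b"fake image bytes"))  # True
      print(instance_b.is_blocked(b"different bytes"))   # False
      ```

      The exact-match weakness is why a perceptual similarity algorithm matters in practice: trivially re-encoding an image changes its SHA-256 digest but not its PhotoDNA-style fingerprint.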

          • @gAlienLifeform
            4 • 1 year ago

            That’s good, but it is still just mind-blowing to me that we let a bunch of private for-profit companies take the lead on this. This is the sort of thing the FBI ought to be all over developing, maintaining, and handing out to everyone, if they weren’t a bunch of stupid assholes busy harassing environmentalists and police brutality protesters.