Not the best news in this report. We need to find ways to do more.

  • chaogomu
    55 · 11 months ago

    The report (if you can still find a working link) said that the vast majority of material that they found was drawn and animated, and hosted on one Mastodon instance out of Japan, where that shit is still legal.

    Every time that little bit of truth comes up, someone reposts the broken link to the study, screaming about how it’s the entire Fediverse riddled with child porn.

    • @MyFairJulia
      18 · 11 months ago

      So basically we had a bad apple that was probably already defederated by everyone else.

      • Kes
        17 · 11 months ago

        More so an apple with controversial but not strictly CSAM material, based in a country where that content is legal. Actually, not even an apple; Lemmy and the fediverse aren’t an entity. It’s an open standard for anyone to use; you don’t see the Modern Language Association being blamed for plagiarized essays written in MLA format, or the WHATWG being blamed because illegal sites are written in HTML. So it’s not a fair comparison to say that Lemmy/the fediverse are responsible for what people do with their open standard either.

      • ZILtoid1991
        7 · 11 months ago

        It’s Pawoo, formerly Pixiv’s own instance, which is infamous for this kind of content, and those are still “just drawings” (unless some artists are using illegal real-life references).

        • @dustyData
          5 · 11 months ago

          They’re using Generative AI to create photo realistic renditions now, and causing everyone who finds out about it to have a moral crisis.

          • ZILtoid1991
            6 · 11 months ago

            Well, that’s a very different and way more concerning thing…

            • @[email protected]
              3 · 11 months ago

              … I mean … idk … If the argument is that the drawn version doesn’t harm kids and gives pedos an outlet, is an AI-generated version any different?

      • chaogomu
        14 · 11 months ago

        Since the release of Stable Diffusion 1.5, there has been a steady increase in the
        prevalence of Computer-Generated CSAM (CG-CSAM) in online forums, with
        increasing levels of realism. [17] This content is highly prevalent on the Fediverse,
        primarily on servers within Japanese jurisdiction. [18] While CSAM is illegal in
        Japan, its laws exclude computer-generated content as well as manga and anime.

        Nope, seems to be the one. They lump the entire Fediverse together, even though most of the shit they found was in Japan.

        The report notes 112 non-Japanese items found, which is a problem, but not a world-shaking one. There may also be issues with federation and deletion orders, but again, nothing massive.

        Really, what the report seems to be about is the fact that moderation is hard. Bad actors will work around any moderation you put in place, so it’s a constant game of whack-a-mole. The report doesn’t acknowledge this basic fact, pretends that no one is doing any moderation, and then lumps in Japan.

        • @dustyData
          8 · 11 months ago (edited)

          I can’t seem to find the source for the report about it right now, but there’s literal child porn being posted to Instagram. We don’t see these kinds of alarmist reports about it because it’s not something new, foreign, and flashy for the general public. All internet platforms are susceptible to this kind of misuse; the question is what moderation tools and strategies are in place to deal with it. Then there’s stuff like Tor, where CSAM was used as a basis to discredit the whole technology, and then it turned out that the biggest repository was an FBI honeypot operation.

          • chaogomu
            10 · 11 months ago

            Buried in this very report, they note that Instagram and Twitter have vastly more (self-generated) child porn than the Fediverse. But that’s deep into section 4, on page 8. No one is going to read that far into the report; they might get through the intro, which is all doom and gloom about decentralized content.