Other samples:

Android: https://github.com/nipunru/nsfw-detector-android

Flutter (BSD-3): https://github.com/ahsanalidev/flutter_nsfw

Keras (MIT): https://github.com/bhky/opennsfw2

I feel it’s a good idea for those building native clients for Lemmy to integrate projects like these and run offline inference on feed content for the time being, to cover content that isn’t marked NSFW but should be.

What does everyone think about enforcing further censorship on the client side, especially in open-source clients, as long as it pertains to this type of content?

Edit:

There’s also this, which takes a bit more effort to implement properly but provides a hash that can be used for reporting: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX

Python package (MIT): https://pypi.org/project/opennsfw-standalone/
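As a rough illustration of what client-side gating could look like, here is a minimal sketch built around opennsfw2’s `predict_image` call (which returns an NSFW probability in [0, 1]). The threshold value and function names below are my own assumptions for illustration, not anything these projects prescribe.

```python
# Sketch: client-side NSFW gating, assuming opennsfw2's predict_image API.
# NSFW_THRESHOLD is a hypothetical cutoff; a real client would tune it.

NSFW_THRESHOLD = 0.8


def should_blur(nsfw_probability: float, threshold: float = NSFW_THRESHOLD) -> bool:
    """Decide whether a feed image should be blurred/hidden client-side."""
    return nsfw_probability >= threshold


def classify_image(image_path: str) -> bool:
    """Run offline inference on one image and return True if it should be gated."""
    # Imported lazily so the pure decision logic above works without the package.
    import opennsfw2 as n2  # downloads model weights on first use
    return should_blur(n2.predict_image(image_path))
```

Keeping the threshold check separate from the model call makes it easy for a client to expose the cutoff as a user setting (e.g. blur vs. hide) without touching the inference code.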

  • @[email protected]
    link
    fedilink
    191 year ago

    Cool, I assume you are volunteering to go through all the pornographic content and make sure it doesn’t contain minors, and that it only involves people that consented to it, and take all legal responsibility for hosting and serving any illegal content? Great, I’ll contact all the Lemmy admins for you.

    • @[email protected]
      link
      fedilink
      -4
      edit-2
      1 year ago

      IANAL, but as far as I know you don’t have to proactively remove illegal content, only the content you’ve been made aware of.

      So all this drama about federating illegal content is very much overblown.

      Edit: sorry about calling it “drama”, didn’t know the full extent of what’s currently happening. (malicious users spamming CP)

      • Scrubbles · 9 points · 1 year ago (edited)

        Seeing how, as an instance owner, I’ve had to become very knowledgeable in the last few hours because of bad actors on Lemmy, this is absolutely not true.

        • @[email protected]
          link
          fedilink
          -11 year ago

          I’m curious: what are the legal duties of a fediverse host regarding illegal content currently? Do you really have to remove illegal content proactively? As far as I know, that’s only required in the EU, and only if you are one of the major digital services (which fediverse hosts aren’t).

      • 𝒍𝒆𝒎𝒂𝒏𝒏 · 6 points · 1 year ago

        I’m pretty sure the mods and admins of lemmyshitpost are fully aware of the illegal content being reported to them in one of the most popular Lemmy tiddeRverse communities, so I’m not entirely confident the ‘proactive removal’ point is relevant in this situation.

        If 10 volunteers can’t keep up with it, most of whom have now quit, I find it really hard to see this as “drama” personally. I see it as a serious issue with real-life consequences for both the instance owner (risk of being raided) and the moderators subjected to reviewing it.

        I suspect you wouldn’t describe it as overblown if you were in the same situation as the mods. I occasionally sift through the modlog, and there are some seriously vile takes in there, plus spam posts and abuse removed by these volunteers on a daily basis, all to keep our feeds clean. Add traumatic content on top of that, and it’s no surprise some mods have left and they’ve shuttered the comm.

        Apologies if I come off as abrasive in this comment in general, but I just vehemently disagree with the take that this is just some “overblown drama”.

        • @[email protected]
          link
          fedilink
          31 year ago

          Ah sorry, I didn’t know there is an attack going on currently. I just saw a bunch of posts about Lemmy being illegal to operate because of the risk of CP federation, and then this post, which seemed to imply that one needs constant automated illegal-content filtering, which as far as I know isn’t required by law unless you operate a major service that is reachable in the EU, and fediverse servers aren’t major enough for that.

          • Scrubbles · 6 points · 1 year ago

            Yeah, on top of that it sounds like the people who did see it are pretty shaken up; apparently it was real fucked up. So not only blocking it from ever hitting the servers for legal reasons, but on top of that just so no one needs to see it. There are third-party tools that will analyze it and block it automatically, and we’re hoping to get those online quickly.

      • Gamey · 3 points · 1 year ago

        Yea, just leave the CSAM till someone reports it, great solution!

        • @[email protected]
          link
          fedilink
          01 year ago

          Well, that’s how it generally worked as far as I know. I’m not saying that you can host illegal stuff as long as no one reports it. I’m saying it’s impossible to know instantly if someone posts something illegal to your server; you’d have to see it first. Otherwise pretty much the entire internet would be illegal, because any user can upload something illegal at any time, and you’d instantly be guilty of hosting illegal content? I doubt it.