• @NOT_RICK · 21 points · 11 hours ago

    Under-moderated and under-administered instances have ended up with child porn on them. If that shit gets federated out, it’s a real mess for everyone. I think screening tools are more advanced now, thankfully, because it’s been a while since the last incident.
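
    The “screening tools” in question are typically hash-matching systems: services such as Microsoft’s PhotoDNA fingerprint each uploaded image and compare it against hash lists of known abuse material maintained by clearinghouses like NCMEC. A minimal sketch of the idea, kept self-contained with a plain cryptographic hash (real deployments use perceptual hashes that survive resizing and re-encoding):

    ```python
    import hashlib

    # Hash list of known-bad material, as distributed by a clearinghouse.
    # (Hypothetical: real systems use perceptual hashes such as PhotoDNA,
    # not SHA-256, so that re-encoded copies still match.)
    KNOWN_BAD_HASHES: set[str] = set()  # loaded from a vetted hash list

    def screen_upload(image_bytes: bytes) -> bool:
        """Return True if the upload matches the hash list and must be blocked."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_BAD_HASHES
    ```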

      • Leraje · 9 points · 9 hours ago

        That means the CSAM (it’s not ‘child porn’, it’s child abuse) remains on the server, which means the instance owner is legally liable. Don’t know about you, but if I were an instance owner I wouldn’t want the shame and legal consequences of leaving CSAM up on a server I control.

      • PhobosAnomaly · 16 points · 10 hours ago

        Make a reliable way to automate that, and you’ll make a lot of money.

        Rely on doing it for yourself, and… well, good luck with your mental health in a few years’ time.

        • FaceDeer · 1 point · 10 hours ago

          AI would be able to do a good first pass on it. Except that an AI that was able to reliably recognize child porn would be a useful tool for creating child porn, so maybe don’t advertise that you’ve got one on the job.
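
          A “first pass” here would mean triage rather than judgment: the model scores incoming media and routes anything suspicious into a human review queue instead of publishing or deleting on its own. A minimal sketch, where classify is a hypothetical model hook and both thresholds are made-up numbers:

          ```python
          from typing import Callable

          def triage(image_bytes: bytes,
                     classify: Callable[[bytes], float]) -> str:
              """First-pass triage using a classifier risk score in [0.0, 1.0].

              `classify` is a hypothetical model hook, not a real API; the
              point is that the model only routes content and a human makes
              the final call.
              """
              score = classify(image_bytes)
              if score >= 0.95:   # hypothetical high-confidence cutoff
                  return "block_and_report"
              if score >= 0.50:   # hypothetical review cutoff
                  return "human_review_queue"
              return "allow"
          ```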

        • @[email protected] (OP) · -16 points · edited 10 hours ago

          So that’s the indispensable service that admins provide: child porn filtering.

          I didn’t realize it was such a large job. So large that it justifies the presence of a cop in every conversation? I dunno.

          • PhobosAnomaly · 14 points · 10 hours ago

            I’ve read through a few of your replies, and they generally contain a “so, …” followed by an inaccurate summary of what the conversation thread is about. I don’t know whether there’s a language barrier here or you’re being deliberately obtuse.

            It would appear that your desire for a community without moderators is so strong that a platform like Lemmy is not suitable for what you want. As such, you are likely not going to find the answer you want here and will spend your time arguing against the flow.

            Good luck finding what you’re looking for 👍

          • @Zak · 6 points · edited 10 hours ago

            If your questions are concrete and in the context of Lemmy or the Fediverse more broadly, admins provide the service of paying for and operating the servers in addition to moderation.

            If it’s more abstract, i.e. “can people talk to each other over the internet without moderators?”, then my experience is that they usually can when the group is small, but things deteriorate as it grows larger. The threshold for where that happens is higher if the group has a purpose or if the people already know each other.

      • partial_accumen · 10 points · 10 hours ago

        > Surely filtering out childporn is something that I can do for myself.

        Even if that were a viable solution (it isn’t), the humans employed to filter out this disgusting content (and worse) are frequently psychologically damaged by the exposure. This includes staff at online content-moderation companies and those in law enforcement who have to deal with that material for evidentiary reasons.

        The reason it’s not a viable solution is that if YOU block it out because YOU don’t want to see it, it’s still there, and it becomes a magnet for those who DO want to see it, because they know it’s allowed. The value of the remaining legitimate content goes down because more of your time is spent blocking the objectionable material yourself, until it’s too much for anyone who doesn’t want that stuff and they leave. Then the community dies.

        • @[email protected] (OP) · -4 points · 9 hours ago

          Personal cp filtering automation and a shared blacklist. That would take care of the problem. No moderator required.
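
          Taken at face value, the proposal amounts to each client subscribing to a community-maintained hash list and hiding matching media before display. A minimal sketch, assuming a hypothetical blocklist URL and again using exact hashes where a real filter would need perceptual ones:

          ```python
          import hashlib
          import urllib.request

          # Hypothetical shared blocklist: one hex-encoded SHA-256 hash per line.
          BLOCKLIST_URL = "https://example.org/shared-blocklist.txt"  # placeholder

          def fetch_blocklist(url: str = BLOCKLIST_URL) -> set[str]:
              """Download and parse the shared blocklist."""
              with urllib.request.urlopen(url) as resp:
                  lines = resp.read().decode("utf-8").splitlines()
              return {line.strip() for line in lines if line.strip()}

          def hide_locally(image_bytes: bytes, blocklist: set[str]) -> bool:
              """Return True if this client should hide the image from its own view."""
              return hashlib.sha256(image_bytes).hexdigest() in blocklist
          ```

          Note that this only hides content from the subscriber; as the replies below point out, the material itself would still be stored and federated by the server, which is where the legal liability sits.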

          • @[email protected] · 3 points · 2 hours ago

            If you could write an automated filter to block CSAM, then Apple, Meta, Alphabet, and others would happily shovel billions at you. Blocking CSAM is a constant and highly expensive job… and when they fuck up, it’s a PR shitstorm.

            • @[email protected] (OP) · 1 point · 34 minutes ago

              Maybe keeping it off the network is a lost cause. If we each block it with personal filtering then that changes the face of the issue.

              • @[email protected] · 1 point · 14 minutes ago

                If Lemmy is a hub for those who want to trade CSAM, then it will be taken down by the government. This isn’t something that can be allowed onto the system.

          • db0 · 3 points · 2 hours ago

            > Personal cp filtering automation and a shared blacklist

            Oh just those, eh?

            Just goes to show how little idea you have of how difficult this problem is.

            • @[email protected] (OP) · 1 point · 50 minutes ago

              This is starting to sound like, “we need constant control and surveillance to protect us from the big bad”.

              You know, for the children.

              • db0 · 2 points · 36 minutes ago

                Mate, if you don’t like the way we run things, go somewhere else. You’re not forced to be here.

                  • db0 · 1 point · edited 21 minutes ago

                    Of course I see the point you’re trying to make, but I also think you’re naive and don’t understand the repercussions of what you’re suggesting.

      • @NeoNachtwaechter · 4 points · 10 hours ago

        > filtering out […] I can do for myself.

        It still means too much legal trouble for the admin if the offending data is on the server.