I was recently thinking about how amazing it is that, with this decentralized community, we’d have no censorship from big corporations. Then I asked myself: what about illegal content? The kind of content that really should not be shared? As an example, what if someone creates a Lemmy instance and starts building a community around CP? Or human trafficking? How do we deal with it? I know that instances can choose which other instances they federate with, so if the most popular instances blacklist that “illegal” instance, its content wouldn’t be easily visible, but it would still be in the Fediverse. Also, would all popular instances have to be quick to blacklist these “illegal” instances? Isn’t that a little too difficult? If we go the other way, where they maintain a whitelist instead, wouldn’t that harm small but legitimate instances? Is there a plan to fight off illegal content?

  • Flax · 29 · 1 year ago (edited)

    Same with the Internet. There aren’t any real, openly operating human trafficking or child porn sites online. Most hosting companies won’t host them, most protection companies won’t shield them from DDoS attacks, and if they try to self-host, law enforcement in their jurisdiction may break down their doors and raid them. If somehow none of that happens, countries will block them anyway, and failing that, communities can defederate from them.

    Stuff like Discord is more of a risk: it’s bigger, so moderation resources are stretched thinner, and it’s private.

    • @JoshNautes (OP) · 2 · 1 year ago

      Yeah, I thought about how that kind of content mostly isn’t on the internet. Well… it is, but as you said, it either gets raided by the host (be it a company or a country) or we don’t really have access to it because it doesn’t show up on Google, so we have no way of knowing it even exists unless we go looking for it. But my experience with Lemmy is that my homepage is filled with posts from instances I never knew existed. What if an instance sharing “bad content” shows up on my homepage? What is the next step I can take?

      • @Asafum · 2 · 1 year ago

        I suppose you could report it to whatever the equivalent of the FBI is in your country if you aren’t in the US and block the instance. Not sure what else we can do about other people’s instances.

        I think it should all exist and then face the consequences of the respective countries’ laws as people report it. The content will always exist in some form, in some place, so at least this way we keep things as open as possible while still dealing with illegal things “properly.”

        • @JoshNautes (OP) · 2 · 1 year ago

          Wait, I can block instances from showing up on my Lemmy.world homepage without having to ask Lemmy.world to block them? Is this a Lemmy-specific feature, or a common Fediverse feature? Will it also stop that instance from showing up in my search results? As an example: if I search for the abbreviation of CyberPunk and the “illegal CP” shows up in my results, can I block that instance from ever appearing in my search?

          • @Asafum · 3 · 1 year ago

            I’m new to all of this so I’m not 100% sure about other ways of accessing the fediverse, but I’m on the mobile app Jerboa and if I click on the three dots next to a post before opening the post there’s an option to block instance or block the individual that made the post.

            So far I’ve just blocked Joe Rogan, I’m tired of his nonsense lol

            • @JoshNautes (OP) · 2 · 1 year ago

              I think he will be my first block as well lol

        • Flax · 1 · 1 year ago (edited)

          Report it to your own instance to have it blocked, I’d say. Instances can block other instances, which stops their users from accessing it through their home instance.

  • @fubo · 18 · 1 year ago (edited)

    As usual, it’s the same as email. There will need to be various sorts of spam filtering developed in order to keep the platform usable. In the meantime — if you see it, report it and delete it.

    Suppose you open up your email and you see that you’ve received a piece of spam that contains CSAM (CP). You have not committed a crime — but you also mustn’t keep it. So you report the spam to your email provider, and you delete it from your mailbox. If you’re very diligent maybe you report it to NCMEC.

    Suppose you run an email server. You’re aware of the existence of spam (alas!) and you do your best to block spam using various technologies ranging from DNSBLs to ML classifiers. If someone on the Internet sends spam containing CSAM to a user on your server, you didn’t send it; they did. The sender committed a crime. Your spam filters just didn’t catch that particular instance. So when your user reports it to you, you improve your spam filters. And you delete it.
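    The DNSBL lookup mentioned above is simple enough to sketch. This is only an illustration of the standard mechanism (the Spamhaus zone name is real; the helper names are made up for this example):

```python
import socket

def dnsbl_query(ip: str, zone: str) -> str:
    """Build the DNSBL lookup name: IPv4 octets reversed, under the zone."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """An IP is listed if the DNSBL zone returns any answer for it."""
    try:
        socket.gethostbyname(dnsbl_query(ip, zone))
        return True
    except socket.gaierror:
        return False

# dnsbl_query("127.0.0.2", "zen.spamhaus.org")
# -> "2.0.0.127.zen.spamhaus.org" (127.0.0.2 is the conventional test entry)
```

    A real mail server would do this per connection and combine the result with other signals rather than rejecting on a single list.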

    Suppose you run an email server. Your spam filters might include a reputation score for other email servers. When your filters notice that a large fraction of the messages from a particular server are spam, they switch to automatically block all mail from that server. Then even if that server tries to send spam to your users, the offending messages never even hit your server’s disk.
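    That reputation mechanism can be sketched in a few lines. This is a toy model with made-up thresholds, not how any real mail filter is implemented:

```python
from collections import defaultdict

class ServerReputation:
    """Toy per-server reputation: block a sending server once too large a
    fraction of its recent messages have been flagged as spam."""

    def __init__(self, spam_fraction=0.8, min_messages=20):
        self.spam_fraction = spam_fraction        # fraction that triggers a block
        self.min_messages = min_messages          # don't judge tiny samples
        self.stats = defaultdict(lambda: [0, 0])  # server -> [total, spam]

    def record(self, server, is_spam):
        self.stats[server][0] += 1
        if is_spam:
            self.stats[server][1] += 1

    def is_blocked(self, server):
        total, spam = self.stats[server]
        if total < self.min_messages:
            return False
        return spam / total >= self.spam_fraction
```

    The same shape of logic would apply to federation: an instance that mostly sends reported content earns an automatic block.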

    Expect that as this platform matures, it will need many of the same sorts of spam-mitigation technology that email and other federated services have used.


    I’m repeating “and you delete it” once again because that’s important. You mustn’t retain copies of illegal files, not even as training data for your spam classifiers. The big email providers and social media companies go to a lot of effort to keep data about CSAM files (such as hashes) without keeping the actual files.

  • AlmightySnoo 🐢🇮🇱🇺🇦 · 16 · 1 year ago (edited)

    If it’s still on the internet you report them to law enforcement. But I’d bet that those intent on hosting those kinds of materials would have already started their own instances on the dark web and there sadly isn’t much we can do in that case. Only way to deal with them seems to be law enforcement’s approach of trapping those predators by posing as clients but then again, that’s their job, not yours. What you can do is 1- defederate, 2- warn other instance admins and 3- report to the police.

    • @JoshNautes (OP) · 5 · 1 year ago

      Which is the scenario I talked about of having a “blacklist”. You said in step 2 to “warn other instance admins”. As I see it, I would first have to know who the admins of every popular instance are, then warn them manually one by one, and that’s assuming I didn’t forget any. And we’re not even talking about other services like Mastodon that could communicate with this “illegal content”; would I have to warn the admins of those instances as well? I think what I’m asking is: is there a way to easily do this? A report system not for a local community, but for the Fediverse itself? And in step 3 you said “report to the police”. What would my local police be able to do about a server running in some random country anywhere in the world?

      • Ech · 6 · 1 year ago

        As I see it I would have to first, know who are the admins of every popular instance

        Every instance will have a list of admins in the sidebar of its main page (the page that pops up when you just type in the instance domain). And you aren’t warning every admin. Just the admins of the instance you are looking from and (if you want) the admins of the instance you found the illegal material on.

        What would my local police be able to do with a server running on a random country anywhere in the world?

        “Law enforcement” would be a better way to phrase it. Your country’s higher-level law enforcement should have a way to report such things and, one would hope, a way to pass that information on to the relevant agency with jurisdiction.

        As for a way of reporting a community/instance to the “Fediverse”? Not really. The whole point is that everything is decentralized. It’s up to each instance to decide what is unacceptable for them.

  • @[email protected] · 9 · 1 year ago (edited)

    Fortunately images and thumbnails uploaded by remote users are hosted on the remote instance, not yours.

    You should focus on making sure that your instance’s communities stay within the legality of your country, as well as flag and deal with any illegal behavior by your users.

    For anything remote you have two options:

    1. Block the whole instance (not recommended unless it’s clear that the whole instance is dedicated to something that’s illegal in your country or if they host something incredibly disturbing)

    2. Click on “Remove” on the main page of a remote community. This will remove the remote community and make it inaccessible to local users but keep you federated to their instance.

    You cannot control what other servers do. There will be servers out there hosting illegal stuff, but that’s not something you or I need to fix; that’s where law enforcement needs to be involved. The only thing you can do is block. If it’s something serious like CP or human trafficking, grab any logs you might have of them, report to the authorities, purge the content from your database, and block the instance.

    • @Zak · 6 · 1 year ago

      Fortunately images and thumbnails uploaded by remote users are hosted on the remote instance, not yours.

      True for Lemmy. False for kbin and Mastodon which create local copies.

        • @Zak · 2 · 1 year ago

          Instance owners should be OK as long as their local laws have some sort of reasonable platform immunity.

    • Kierunkowy74 · 4 · 1 year ago

      Dear /kbin admins and users:

      Fortunately images and thumbnails uploaded by remote users are hosted on the remote instance, not yours.

      True for Lemmy, false for /kbin. Example: a meme post from lemmy.ml, where the image has been fetched and is present in kbin.social’s database.

    • @JoshNautes (OP) · 1 · 1 year ago

      I’m sorry if I’m asking dumb questions; I’m new to the Fediverse. As I understand it, we are talking as if I had my own instance. But what about “public” instances like Lemmy.world? What if I’m scrolling through my Lemmy.world homepage and that kind of content shows up?

      • @foggenbooty · 3 · 1 year ago

        What he’s saying is that Lemmy.world does not have to worry about the illegal content being stored on their server, and if you had your own instance, the same would apply to you. It’s the instance hosting the content that’s in legal jeopardy.

        The Fediverse is open to all by default, and if you choose to set your homepage to list “All”, then you open yourself up to all of it. As others have mentioned, you should report anything illegal you see so that the instance owner (or you, if you self-host) can block that content or defederate the offending instance, and report it to the authorities if needed.

        You can’t have a safe space free from anything you oppose while also having pure freedom. There’s always a balance.

        • @JoshNautes (OP) · 2 · 1 year ago (edited)

          Is there an easy way to report an instance to admins so that it can be blacklisted? What about other instances? Is there an easy way to warn all instances about it? And other services like Mastodon or kbin? Would they all have to be warned individually? If there isn’t an easy way of doing this, shouldn’t it exist? A way for users to report a piece of content to all instances, so each can choose whether to block it or not?

    • @cats · 5 · 1 year ago

      I think the morality of legality is subjective, but isn’t legality itself objective?

      • Ech · 3 · 1 year ago

        If law makers, judges, lawyers, and law enforcement officials were all perfect? Yes, it would be. That is not a world we live in, though.

      • RoboticMask · 2 · 1 year ago

        I don’t think so, because otherwise judges would never have difficulty ruling on well-known behaviour.

    • @JoshNautes (OP) · 3 · 1 year ago

      Yeah, I totally understand that. As a personal example: I’m against civilian use of firearms, but if someone is using the Fediverse to sell them, who am I to say that’s illegal? It might be illegal where I live, but maybe it’s legal where they live; we can’t really be judges on these kinds of topics. I used the term “illegal” because I couldn’t find a better term for the kinds of subjects that (hopefully) 99.99% of people would totally NOT be okay with showing up on their homepage, like the two examples I provided. What is the plan for that?

      • @foggenbooty · 3 · 1 year ago

        I think we’re all hoping it will sort itself out. The gun example isn’t that great, but let’s use CP or trafficking as you first mentioned. All the sites that host such things on the internet today (and they unfortunately do exist) have been driven underground to avoid being policed. The same should theoretically happen in the Fediverse.

        If someone was going to create a CP focused Lemmy, they’re technically able to do so as the software is open source, and they’re technically able to federate it with other servers, and it could technically show up if you filter by all. However I think this is very unlikely because it would bring attention to that instance and hopefully a response from law enforcement. You don’t want to run an underground operation in the light.

        So the plan is to report anything you stumble upon, and historically that has worked well enough to push such content out of public view. Now, if you’re talking about stopping these unfortunate crimes, that’s a different story altogether, and no one has a solution.

        • @JoshNautes (OP) · 1 · 1 year ago

          I guess your answer made me a little less nervous about this. It would make sense that an illegal instance would not federate itself. Thank you, kind stranger. There is just one more thing I’m curious about: you said “report anything”, but when I click the “report” button on a post, who receives that report? The admins of the instance hosting the content, or the admins of my home instance?

      • Flax · 3 · 1 year ago

        Would they be willing to sell them to you, though? I keep getting local drug dealers trying to interact with me on Instagram, so I don’t see how it would be worse on the Fediverse. Even some Reddit subs have drug dealers advertising.

  • minnix · 3 · 1 year ago

    As an example, what if someone creates a Lemmy instance and starts creating a community around CP? Or human trafficking? How do we deal with it?

    Can you elaborate on who “we” is?

    • LUHG · 2 · 1 year ago

      Instances. Like Lemmy.world or beehaw.

      We’d need a global blacklist flag, something that instance owners can maintain. Then they can decide what to allow or block.
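      Something like that could be as simple as a plain-text list of domains that admins fetch and merge with their own overrides. Everything here is hypothetical (the URL, the file format, the function names); it only sketches the idea that a shared list informs, but each admin still decides:

```python
import urllib.request

# Hypothetical: a community-maintained blocklist published as plain text,
# one instance domain per line, '#' for comments. No such service exists.
BLOCKLIST_URL = "https://example.org/fediverse-blocklist.txt"

def fetch_shared_blocklist(url: str = BLOCKLIST_URL) -> set:
    """Download the shared list and return it as a set of domains."""
    with urllib.request.urlopen(url) as resp:
        lines = resp.read().decode("utf-8").splitlines()
    return {ln.strip() for ln in lines if ln.strip() and not ln.startswith("#")}

def effective_blocklist(shared: set, also_block: set, allow_anyway: set) -> set:
    """Start from the shared list, then apply this admin's own overrides,
    so the final allow/block decision stays local."""
    return (shared | also_block) - allow_anyway
```

      The override step matters: a shared flag that admins could not locally overrule would just recreate centralized censorship.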

  • @[email protected] · 3 · 1 year ago

    For the user, all you can do is report and block. It is ultimately up to the mods and admins of their individual communities to remove bad content.