It is expected to be 2-3 months before Threads is ready to federate (see link). There will, inevitably, be five different reactions from instances:

  1. Federate regardless (mostly the toxic instances everyone else blocks)

  2. Federate with extreme caution and good preparation (some instances with the resources and remit from their users)

  3. Defederate (wait and see)

  4. Defederate with the intention of staying defederated

  5. Defederate from Threads and from any instance that federates with it

It’s all good. Instances should do what works best for them, and people should make their home on whichever instance has the moderation policies they want.

In the interest of instances that choose options 2 or 3, perhaps we could start building a pre-emptive block list of known bad actors on Threads?

I’m not on Threads myself, but I think a fair few people here are, and there are various commentaries that name some of the obvious offenders.
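If anyone wants to collaborate on that, here’s a rough sketch of how submissions could be merged. It assumes a pile of plain-text files with one handle per line (the file layout and script name are just placeholders), dedupes them, keeps only @threads.net accounts, and prints one account address per row, which is the shape Mastodon’s “Blocking list” import expects as far as I know. Adapt it to whatever your instance’s tooling actually uses.

```python
#!/usr/bin/env python3
"""Merge community-submitted lists of Threads accounts into one
deduplicated blocklist. File names and output format are assumptions,
not any instance's official process."""

import csv
import sys
from pathlib import Path


def load_handles(path: Path) -> set[str]:
    """Read one handle per line, ignoring blank lines and # comments."""
    handles = set()
    for line in path.read_text(encoding="utf-8").splitlines():
        handle = line.strip().lstrip("@").lower()
        if handle and not handle.startswith("#"):
            handles.add(handle)
    return handles


def main() -> None:
    if len(sys.argv) < 2:
        sys.exit("usage: merge_blocklist.py SUBMISSION.txt [SUBMISSION.txt ...]")

    merged: set[str] = set()
    for src in sys.argv[1:]:
        merged |= load_handles(Path(src))

    # Keep only handles that actually live on Threads.
    threads_only = sorted(h for h in merged if h.endswith("@threads.net"))

    # One account address per row; review the output before importing it.
    writer = csv.writer(sys.stdout)
    for handle in threads_only:
        writer.writerow([handle])


if __name__ == "__main__":
    main()
```

Something like `python merge_blocklist.py submissions/*.txt > threads-blocklist.csv` would produce a single file people could review before anyone imports it.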

  • Kichae · 1 year ago

    For many people, it’s not about whether people can make the effort to see what they’ve posted online. It’s whether people who would harass them have a friction-free path to do so, and Threads is such a path. It will be all but totally unmoderated with respect to hate and harassment, and will be the biggest Nazi bar on the block.

    Protecting the vulnerable means keeping the assholes away. If we can’t care about the vulnerable, then I guess we deserve Zuck.

    • effingjoe · 1 year ago

      Why do you think it will be unmoderated? Keep in mind I have very little exposure to Instagram and even less to Threads itself.

      • Kichae · 1 year ago

        Because effectively moderating hundreds of millions of active users is expensive and unprofitable, and because we can look at Meta’s existing platforms to see what their standards of moderation are.

        • effingjoe · 1 year ago

          Anecdotal statements from people using Threads suggest otherwise.

          • Kichae · 1 year ago

            I think you’re confusing “removes content that bothers the social hegemony” with moderation.

      • artisanrox · 1 year ago

        Because it already is.

        Facebook (owned by Meta) has a clear history of allowing deadly medical and political disinformation to spread, to the point where we elected someone who sold our state secrets to the highest bidder, and millions of people died from a SARS virus.