The sole moderator doesn’t even follow their own rules: https://lemmy.ca/post/22741340?scrollToComments=true

I’ll just say it - it’s a Russian propaganda community. Is there any reason this community needs to exist on Lemmy.ca? Is there a rule against blatant astroturfing / propaganda / misinformation? I don’t think the 5 rules in the sidebar are going to be enough to stop an army of trolls:

> 1. No bigotry - including racism, sexism, ableism, homophobia, transphobia, or xenophobia.
> 2. Be respectful. Everyone should feel welcome here.
> 3. No porn. Use the NSFW tag when needed.
> 4. No Ads / Spamming.
> 5. Bot accounts need to be flagged as such in their settings.

Maybe time to get ahead of it?

  • @[email protected]
    link
    fedilink
    English
    35 months ago

    I don’t think the lemmy.ca admins or most of its users want the instance to take on the responsibility of prescribing what content is acceptable and unacceptable (banned), above and beyond objectively objectionable stuff. Curb appeal as an argument doesn’t sway me. But if curb appeal or who we’re attracting is a concern, I’d point out that most of the posts in that community are heavily downvoted, so to some extent Lemmy’s existing checks and balances are working as intended to limit newcomers’ exposure to an unpopular community.

    • @[email protected]OP
      link
      fedilink
      English
      2
      edit-2
      5 months ago

      I can respect this take, but I worry that burying problematic content isn’t enough these days. Even if only 2% of this site’s visitors see the content, all it takes is one person believing there’s a demonic child-trafficking ring, and then you have someone shooting up a pizza joint. Not everyone who uses the internet has all their faculties, and I think that’s an argument for going further than just burying the content. (I suspect we’ll start seeing more pressure on YouTube and Facebook to go further than they have, too, with regard to problematic content like this.)

      Edit: I also think that as platforms have become stricter about their community guidelines, the effectiveness of grand, overt disinformation campaigns has diminished, so bad actors are shifting to subtler, softer disinformation campaigns.