• @qooqie · 1 year ago

    So this study kind of hints at it, but their (Facebook’s) algorithm worked exactly as intended: showing users the stuff they want to see and that will keep them engaged. This has the unfortunate downstream consequence of radicalizing the consumer. I don’t recall the term exactly, but there’s a psychology term for groups whose members tend to radicalize more and more over time. Us humans just did human things and didn’t think about the group we’re trying to appeal to. You can see this on any left-leaning political sub too. Rhetoric in both groups tends toward the extreme, especially with highly curated (e.g. algorithmic or upvote-driven) feeds. I don’t know how we can escape this other than cutting constant political news out of our lives (which I have done, and am much happier for it).

    • @Dark_Blade · 1 year ago

      The only thing you can do, really, is take ‘detox’ breaks and perform self-assessment every now n’ then. I’ve noticed my own leanings shift towards extremes, and it’s very hard to pull yourself out of it.

      • @qooqie · 1 year ago

        Same, man! For a while before Reddit shit the bed, I decided I could handle politics again and unblocked the politics subreddit, and fuck, it quickly pulls you in towards an extreme. I saw what was happening and detoxed like you said, but damn, I wish more people could look in the mirror every now and then.

        • @Dark_Blade · 1 year ago

          The problem is that very few people have the self-awareness required to understand what’s happening to them, especially when extremes are so…alluring. Nuance requires way too much higher-level thinking, and our monkey brains just love ‘thing good, other thing BAD’.

          Also, there’s the comfort of belonging to a group of some sort. When there are so many people who agree with so many of your beliefs, you see the extreme stuff they say and wonder to yourself, ‘hm, but what if they’re right about this too?’

  • @Zummy · 1 year ago

    It’s not that complicated. The answer is: yes, they did. They were aware that articles containing false information existed, and they didn’t remove them. Leaving up articles that spread lies definitely led to more division in the 2020 election.

  • @oblique_strategies · 1 year ago

    The medium is the message; the content is irrelevant. The scale, shape, symbolic meaning, targeted audience, access and interaction, duration of exposure, perceived authority (or subversive nature) of “shared” information, and saturation in culture and daily life made possible by the pipe that spews the content are likely far more responsible than the exact nature of the content users engaged with. Encoded alongside the content, the characteristics of the platform are what shape and control human behavior, not just the content itself.