• @[email protected]
    34
    1 month ago

    My (conspiracy?) theory for the last few years is that the world at large is shaped by a misalignment of various ML algorithms that were given the goal "maximize the time the user spends scrolling on our platform" several years ago, and it has turned out that if you turn someone into a conspiracy nutjob, radicalize them, or in general turn them into a piece of shit, it rapidly increases their engagement with the platform. It makes sense to me, because it probably ostracizes you from the people around you, and your nutjob social bubble that validates your new opinions is only on the platform, making you spend a lot more time there. It turns out that a lot of far-right and fascist ideologies are pretty comforting: you have someone to blame, and you can just be a dick to people you don't like, which is a lot easier than being nice to others, dealing with stuff you are not comfortable with, and accepting that you might be the problem. This shit sells, and the ML algorithms have probably figured out that it's the thing that sells the most.
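    To make the incentive concrete, here is a minimal sketch of that feedback loop. Everything in it is an assumption for illustration: the category names, the dwell times and the fake user are made up, and real recommenders are vastly more complex. The only premise it encodes is the one above, that divisive content holds attention longer, and the point is that a reward of "time spent" alone is enough to make the system converge on pushing it.

    ```python
    import random

    # Hypothetical content categories with assumed average dwell times in
    # seconds. The numbers are invented; the only premise is that outrage
    # and conspiracy content holds attention longer than neutral content.
    TRUE_MEAN_DWELL = {
        "cooking": 20.0,
        "news": 35.0,
        "outrage/conspiracy": 70.0,
    }

    def simulated_session(category: str) -> float:
        """Fake user: dwell time is the category's mean plus noise."""
        return max(0.0, random.gauss(TRUE_MEAN_DWELL[category], 10.0))

    def run_recommender(steps: int = 10_000, epsilon: float = 0.1) -> dict:
        """Epsilon-greedy bandit whose only objective is time-on-platform."""
        counts = {c: 0 for c in TRUE_MEAN_DWELL}
        value = {c: 0.0 for c in TRUE_MEAN_DWELL}  # running mean dwell estimate
        for _ in range(steps):
            if random.random() < epsilon:
                choice = random.choice(list(TRUE_MEAN_DWELL))  # explore
            else:
                choice = max(value, key=value.get)  # exploit best-known arm
            reward = simulated_session(choice)
            counts[choice] += 1
            value[choice] += (reward - value[choice]) / counts[choice]
        return counts

    for category, shown in sorted(run_recommender().items(), key=lambda kv: -kv[1]):
        print(f"{category:20s} shown {shown} times")
    # The divisive category ends up recommended roughly 90% of the time,
    # because nothing in the reward says anything about the user's wellbeing.
    ```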

    That was my reason for starting to avoid any kind of content delivered by a personalization algorithm. I suspected this even before the LLM craze, but seeing how extremely good an ML algorithm can get at a single task (turning text into images, for instance) if given enough data, it is horrifying to think what an algorithm could accomplish when trained on literally billions of users handing over their whole lives, conversations and behavior as training data, given the task "here is everything about this user; what do I show them to keep them on my platform?".

    I wager that for quite a lot of people the answer is "tell them it's OK to be a fascist and to hate everything that scares them".

    That shit is super scary, because even if you know this is happening, the ML algorithm has hundreds of thousands of people like you and has already had several years of feedback loop to figure out what will work on you (which probably isn't the same thing that works on others, but something probably exists). And the only way to avoid it is to never use any kind of personalized content, most importantly personalized search.
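    The "people like you" part is nothing exotic either; it is essentially collaborative filtering. A toy sketch, with an invented engagement matrix and made-up numbers (real systems use learned embeddings over far richer signals), of how whatever hooked users similar to you gets transferred onto you, even if you never touched that kind of content yourself:

    ```python
    import numpy as np

    # Hypothetical engagement matrix: rows are users, columns are content
    # categories, values are average minutes of engagement. All invented.
    categories = ["cooking", "news", "outrage/conspiracy"]
    engagement = np.array([
        [30.0,  5.0, 60.0],  # user 0
        [28.0,  6.0, 55.0],  # user 1: behaves a lot like user 0
        [ 2.0, 40.0,  4.0],  # user 2: a very different profile
    ])

    def recommend_for(new_user: np.ndarray) -> str:
        """Find the most similar known user and push what hooked *them*."""
        sims = engagement @ new_user / (
            np.linalg.norm(engagement, axis=1) * np.linalg.norm(new_user)
        )
        neighbor = int(np.argmax(sims))  # nearest user by cosine similarity
        return categories[int(np.argmax(engagement[neighbor]))]

    # A new user who has so far only ever watched cooking content...
    you = np.array([25.0, 3.0, 0.0])
    print(recommend_for(you))  # -> "outrage/conspiracy", learned from your neighbors
    ```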

    And then you also have a lot of nation state threat actors who are actively using this to push an agenda. The world is fucked.

    • @[email protected]
      12
      1 month ago

      It makes sense to me, because it probably ostracizes you from the people around you, and your nutjob social bubble that validates your new opinions is only on the platform, making you spend a lot more time there.

      Seems quite accurate to me. A former friend who drifted into one of these right-wing bubbles posted a poll from Germany, where we're both based, showing that around 80% of Germans would have voted for Harris. He claimed he doesn't know a single one of them and that the poll must be fake.

      Of course the poll was relatively accurate; he is simply only exposed to the bubble of Trump fans, roughly 10% of the population, and does not realize it.

    • @[email protected]
      9
      1 month ago

      This was proven true by every analysis of the ‘algorithms’ of social networks.

      Capitalism is a cancer.

    • @[email protected]
      3
      1 month ago

      Hey, thanks for sharing this.

      You said something that hits really close to home about one of my most important relationships — a relationship that is starting to experience value drift as they get sucked deep into social media.
      I’ve been feeling like a conversation needs to happen, but haven’t had the ability to characterize my thoughts as well as I’d like. Your comment helped me a lot to get closer to what I’d like to express about it to them.