Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things: guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, the works. I almost never let it take me to that type of video, and when I do it is either by accident or out of curiosity. My interests are gaming, stand-up comedy, memes, react videos, metallurgy, machining, and blacksmithing, with occasional songs and videos about funny bullshit. I am not from America, and I consider myself pretty liberal if I had to put it in the terms used in America. But European liberal, so by American standards a socialist. Why does it recommend this shit to me? Is this some kind of vector for radicalization of guys in my category? Do you have a similar experience?

  • @[email protected]
    link
    fedilink
    31 year ago

    If they piss you off, you will stay on their platform longer, and they make more money.

    That is the sad truth of EVERY social network.

    Lemmy might not be that advanced yet, but once it gets big enough to need ads to pay for bandwidth and storage, it won't be long before they add algorithms that show you stuff that pisses you off.

    One way to combat this is to take a break from the site. Usually, if you come back after a week, it will be better for a while.

      • @[email protected]
        link
        fedilink
        11 year ago

        Who is hosting this? Lemmy.ml, all the federated sites? With the Reddit exodus there is probably a lot more activity. Who's paying for that? That's who "they" would be in this situation, I think.

    • @[email protected]
      link
      fedilink
      2
      edit-2
      1 year ago

      I think it has more to do with the stuff you watch than with wanting to piss you off.

      All YouTube recommends to me is K-pop, dog grooming, Kitten Lady, and some Friesian horse stable that crossed my feed once. Oh, and some historical sewing stuff.

      If they started recommending stuff that pissed me off, I wouldn't bother going back except for videos linked directly from elsewhere.

      Edit: Rereading what OP said they watch, their interests are also primary interests of the right wing in the US. If they don't train the algorithm by telling it they don't want that content, the algorithm doesn't know that those interests don't intersect. A rough sketch of why that happens is below.
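
      To make that concrete, here is a toy sketch (not YouTube's actual system; the channel names and co-watch data are invented) of how audience overlap alone can drive cross-topic recommendations: if the same viewers watch both gaming/machining channels and political commentary, an item-similarity recommender will surface the latter even to someone who never asked for it.

      ```python
      # Toy illustration only, not YouTube's real recommender. Channel names
      # and co-watch data below are made up.
      from math import sqrt

      # Hypothetical co-watch data: for each channel, the set of user IDs
      # that watched it.
      co_watch = {
          "gaming":         {1, 2, 3, 4, 5, 6},
          "standup":        {2, 3, 4, 7, 8},
          "machining":      {1, 4, 5, 9},
          "political_rant": {2, 3, 4, 5, 8, 9},  # audience overlaps heavily with the above
          "dog_grooming":   {10, 11, 12},        # audience barely overlaps at all
      }

      def audience_similarity(a, b):
          """Cosine similarity between two channels' audience sets."""
          shared = len(a & b)
          return shared / (sqrt(len(a)) * sqrt(len(b))) if shared else 0.0

      def recommend(watched, catalog, top_n=3):
          """Rank unwatched channels by how similar their audiences are to the
          audiences of the channels this user already watches."""
          scores = {}
          for name, audience in catalog.items():
              if name in watched:
                  continue
              scores[name] = max(audience_similarity(catalog[w], audience) for w in watched)
          return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

      # A user who only watches gaming, standup, and machining still gets
      # "political_rant" ranked first, purely because the same people watch both.
      print(recommend({"gaming", "standup", "machining"}, co_watch))
      ```

      Hitting "Not interested" or "Don't recommend channel" is the explicit feedback that tells the system the overlap doesn't apply to you; otherwise it only sees what everyone else with a similar watch history clicked on.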