• @[email protected]
        12 points · edited · 2 days ago

        Ew gross.

        I’m not going to keep the scalps of any Nazi I kill while defending my home and loved ones.

        I’ll just use pen and paper to keep track.

        (I’m not bothered by your comment at all; I’m just attempting to humorously “yes, and” it.

        I’m going for a humorous misdirect where the reader thinks I’m disgusted by the threat of killing Nazis, but I’m actually just offended by inefficient, messy ways of keeping track of any killed Nazis.)

        • @Sidhean
          16 points · 2 days ago

          The second paragraph of explanation being bigger than the whole joke wraps back around to being hilarious.

          • @impudentmortal
            1 point · 2 days ago

            But it makes me wonder if they know it’s a reference to Inglourious Basterds

        • femtech
          3 points · 2 days ago

          Yeah, I like a clean, minimalist space. No plants, not many picture frames. Definitely would need an excel spreadsheet lol

  • @lordnikon
    72 points · 3 days ago

    This is not a story of the algorithm predicting what you like. It’s showing that if you expose a human to the same content over and over again, it can change their way of thinking so they come to like the thing they’re exposed to. Even more so if they don’t know how it works: the person thinks “everyone is into slime, so I need to be into it too to fit in.” It’s very powerful if you want to manipulate a populace. It’s algorithm-induced Stockholm Syndrome.

    • @[email protected]
      12 points · 2 days ago

      You don’t even need an algorithm to do it either. I didn’t use Linux when I signed up for Lemmy…

      • @lordnikon
        6 points · 2 days ago

        It’s not always a bad thing lol

    • @Kelly
      24 points · 3 days ago

      It’s showing that if you expose a human to the same content over and over again, it can change their way of thinking so they come to like the thing they’re exposed to.

      See also: top 40 radio

  • @[email protected]
    26 points · 2 days ago

    The only reference I have for this is someone I knew who rubbed said slime on herself for YouTube when she was 17, to build a following for when she turned 18 and started camming.

  • nyahlathotep
    44 points · 3 days ago

    The phrase “ok slimers” legitimately made me laugh out loud

  • @[email protected]
    41 points · 3 days ago

    I’m certain my YouTube feed is trying to radicalize me into some kind of culture warrior. It’s really annoying. I deleted all of my watch history to try and reset it and it just got way worse real quick. I watch one stupid video, now all I see are angry tubers upset that people don’t think exactly like they do and enjoy things they don’t. Then they convince themselves they’re more enlightened than anyone else because they make this content and ban anyone who makes fun of them, all while claiming to be “free speech advocates” of course.

    YouTube got bad so fast it’s left my head spinning.

    • @Jyrdano
      11 points · 2 days ago

      Yup, last week I clicked on a YT video about a certain game and shut it down about a minute in after realising it was just another rage-baiting angry youtuber lamenting how the game is too woke. Now all I get are recommendations for angry anti-woke YouTube videos bashing the game I actually enjoy.

      • @Renacles
        3 points · 2 days ago

        Veilguard? I keep blocking those channels.

        • @Jyrdano
          3 points · 2 days ago

          Yeah, I was being vague to avoid derailing this thread off topic too much.

    • Fleppensteyn
      6 points · 2 days ago

      I started with a clean profile: I never log in to YT so it’s just using a local cookie you can always clear to start over.

      Anyways, I just searched a few sciencey things to feed the algorithm and now I’m getting loads of crazy fake “science” and conspiracies and the rest is all extremist right wing bullshit.

      YouTube is getting useless.

    • @[email protected]
      22 points · 3 days ago

      Have you tried clicking the 3 dots on these outrage videos and selecting “don’t recommend channel” or a mix of that and “not interested?” I started to see a bunch of right wing political trash in my feed a while back since a lot of my watched videos could be considered adjacent (cars/trucks/offroading/home improvement/dash cam vids/etc) to what these people like and I haven’t really had this issue again.

      • @[email protected]
        30 points · 3 days ago

        It’s wild that right wingers are always complaining about big tech censoring them when YouTube and Facebook are pushing far-right content so much

        • @[email protected]
          10 points · edited · 2 days ago

          It’s wild that right wingers are always complaining about big tech censoring them when YouTube and Facebook are pushing far-right content so much

          I’ve got a conspiracy theory about this:

          1. Everyone likes kittens.
          2. Some of us who like kittens think about how to act decently to each-other, some of the time.

          Leading to:

          1. Right wingers who like kittens will sometimes see something “woke” in their algorithm feed, and they feel attacked.
        • @[email protected]
          10 points · 2 days ago

          They still think that YouTube and Facebook are representative of the average person. They don’t understand how incredibly curated those feeds are. I think that’s where some of the “silent majority” mythos comes from. Everything they see is people agreeing with them, therefore it’s impossible that Joe Biden got more votes in 2020.

      • @Jiggle_Physics
        4 points · 2 days ago

        I have done this. I have told it over and over not to show me stuff from those channels and topics. The best it lasts is about 2 months. When I tried deleting my history and turning it off, it got SO MUCH WORSE. Even making a new account was orders of magnitude worse. As sad as it is, I am actually getting a better result this way. I have long been at the point where I do not click on things I am not familiar with without a suggestion from a trusted source. So I just don’t look at recommendations anymore; I just look for the indicator of new stuff from my subs, or for things I specifically search for.

    • @thedirtyknapkin
      4 points · 2 days ago

      You can turn off watch history altogether and it will just give you generic related videos on the side.

    • Ephera
      1 point · 3 days ago

      If you want it to just not recommend things, you might prefer switching to an RSS feed, or to something like NewPipe.

  • Gil Wanderley
    13 points · 3 days ago

    And that is why I only open videos about topics I am only mildly interested in, or from controversial channels, in incognito mode, even though I actually pay for ad-free YouTube Premium.

    In my defense, that is the only streaming service I pay for.

    • @Agent641
      8 points · 2 days ago

      TFW you forget to wear protection when clicking on a weird video and you permanently scar your algorithm. You try to heal it, but days or weeks later, you are showing your boss a video on marine grade industrial sealant and Chappell Roan Pink Pony Club shows up in your recommended videos and you have to lie and say you have no idea what it is. When he is gone, you play it again.

      • ObjectivityIncarnate
        1 point · 2 days ago

        clicking on a weird video and you permanently scar your algorithm.

        It’s trivial to delete individual videos from your watch history, even more so if you just saw it. Doing so makes it as if you never clicked on it in the first place.

    • @ripcord
      2 points · 2 days ago

      Or turn off history tracking and it stops doing that shit

  • FaceDeer
    4 points · 3 days ago

    As recent advances in AI have shown, humans are really quite predictable when you throw enough data and compute at the problem. At some point the algorithm will be sophisticated enough that it’ll be able to get to know you better than you know yourself, and will be able to provide you with things you had no idea were what you really wanted.

    Interesting times.

      • Chris
        7 points · 3 days ago

        Yes, I heard/saw/read some years back that this is exactly what Amazon do. They know who you are, what stage of life you are at, and they know what you want before you do.

      • FaceDeer
        8 points · 3 days ago

        Yes, but recent advances have really rubbed it in our faces in ways that are a lot harder to deny. Humans haven’t become fundamentally more or less predictable over time but recent advances have shown how predictable we are.

        • @[email protected]
          2 points · 2 days ago

          Yep. I learned from an algorithm that I might enjoy music by “The Beatles”. The algorithm was quite correct, but I think my having simple tastes, and the Beatles having amazing music, deserve most of the credit.

    • @Schmeckinger
      6 points · 3 days ago

      Yeah algorithms keep throwing stuff at me I would probably like to watch, but I don’t click on it to not get even more brain damage.

    • @[email protected]
      1 point · 2 days ago

      Yeah, doubtful. I think it finds something you will engage with and pushes it over and over until people get normalized to it.

      I think it’s more like cold reading from a psychic. It uses generic, generalized data about the big identifiers for you, like age and gender, and as you respond it adjusts its answers based on what you gave it.

      That’s not new or magical in any way. And it can be really wrong about the broad stuff if you don’t fit in with generic identifying groups related to you.

      It really just feels like a sales pitch for the middle class to buy more stuff.

    • @[email protected]
      2 points · edited · 2 days ago

      I had this exact experience with music algorithm recommendations:

      The algorithm analyzed all the songs I asked it to play, and concluded (correctly) that I might enjoy listening to the Beatles. (True story.)

      (Now a bit of sarcasm:) I look forward to future insights, in other art forms, such as perhaps the writings of Shakespeare or the paintings of Leonardo Da Vinci.

    • queermunist she/her
      1 point · 2 days ago

      That is not what happened.

      Humans aren’t static. You don’t actually have these secret hidden likes AI can discover, instead, you grow to like the stuff that becomes familiar. You’re being trained.

    • Ephera
      1 point · 3 days ago

      Problem is that none of the algorithms actually care about showing you things you like.

      Ads try to sell you on things that you wouldn’t otherwise buy. Occasionally, they may just inform you about a good product that you simply didn’t know about, but there’s more money behind manipulating you into buying bad products, because it’s got a brand symbol.

      And content recommendation algorithms don’t care about you either. They care about keeping you on the platform for longer, to look at more ads.
      To some degree, that may mean showing you things you like. But it also means showing you things that aggravate you, that shock you. And the latter is considered more effective at keeping users engaged.