• Madison_rogue · 72 · 1 year ago (edited)

      Seriously though, she chose a show that was randomly chosen by the algorithm, she watched it, and more content of that type was suggested to her by the algorithm.

      This isn’t quite rocket science.

        • @[email protected] · 14 · 1 year ago

          Has this story ever been confirmed by Target directly? As this happened in America and her father was outraged about it, it would have been awfully convenient to “blame” the algorithm for “discovering” she was pregnant. It takes quite a data analyst to figure out trends before someone even knows they are pregnant. It doesn’t take a genius to figure out a pattern for someone who knows they are pregnant and is just hiding it from their dad.

          • @what_is_a_name · 17 · 1 year ago

            Yes. It’s many years in my past, but this was confirmed. Target still does their targeting, but now scatters unrelated items among the ads to hide what they know.

        • Madison_rogue · 9 · 1 year ago

          They didn’t figure anything out. There’s no sentience in the algorithm, only the creators of said algorithm. It only chose content based on input. So it all revolves around the choices of the article’s author.

          Same thing with the woman who was pregnant: the algorithm gave choices based on the user’s browsing history. It made the connection that product A was also chosen by expecting mothers, and therefore the shopper might be interested in product B, something an expecting mother would buy.
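The co-occurrence idea described in that comment (“product A was also chosen by expecting mothers, so suggest product B”) can be sketched in a few lines. This is a toy illustration, not Target’s or Netflix’s actual system; the basket data and product names below are entirely made up.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical purchase baskets; products are illustrative only.
baskets = [
    ["unscented lotion", "prenatal vitamins", "cotton balls"],
    ["unscented lotion", "prenatal vitamins", "diapers"],
    ["unscented lotion", "diapers"],
    ["diet coke", "chips"],
]

# Count how often each pair of products appears in the same basket.
cooccur = defaultdict(lambda: defaultdict(int))
for basket in baskets:
    for a, b in combinations(set(basket), 2):
        cooccur[a][b] += 1
        cooccur[b][a] += 1

def recommend(product, k=3):
    """Suggest the products most often bought alongside `product`."""
    ranked = sorted(cooccur[product].items(), key=lambda kv: -kv[1])
    return [name for name, _ in ranked[:k]]

print(recommend("unscented lotion"))
```

No sentience required: the model never “knows” anyone is pregnant, it just surfaces items that co-occur with the shopper’s history, which is exactly the pattern the thread is describing.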

            • Madison_rogue · 7 · 1 year ago

              Sorry, I misunderstood your tone. Apologies for going all pedantic… it’s a character flaw.

          • @[email protected] · 3 · 1 year ago

            I believe in the case of the pregnant woman, she was offered diapers and the like, based on the food she bought. So it’s not simply “you bought diet coke, maybe try diet chocolate?”. In the case of Netflix there’s no “show only gay people watch”, so her complaints are silly.

  • EnderWi99in · 38 · 1 year ago

    Because you watched stuff that a lot of gay people watched and then watched more stuff the algorithm suggested based on your previous watch history. It’s not magic or anything.

  • AutoTL;DR (bot) · 8 · 1 year ago

    This is the best summary I could come up with:


    “Big data is this vast mountain,” says former Netflix executive Todd Yellin in a video for the website Future of StoryTelling.

    Facebook had been keeping track of other websites I’d visited, including a language-learning tool and hotel listings sites.

    Netflix told me that what a user has watched and how they’ve interacted with the app is a better indication of their tastes than demographic data, such as age or gender.

    “No one is explicitly telling Netflix that they’re gay,” says Greg Serapio-Garcia, a PhD student at the University of Cambridge specialising in computational social psychology.

    According to Greg, one possibility is that watching certain films and TV shows which are not specifically LGBTQ+ can still help the algorithm predict “your propensity to like queer content”.

    For me, it’s a matter of curiosity, but in countries where homosexuality is illegal, Greg thinks that it could potentially put people in danger.


    I’m a bot and I’m open source!