- cross-posted to:
- [email protected]
cross-posted from: https://lemmy.zip/post/1386796
Archived version: https://archive.ph/F9saW
Archived version: https://web.archive.org/web/20230812233105/https://www.bbc.co.uk/news/technology-66472938
How did Netflix know I was gay before I did?
Honey
EVERYONE knew
Seriously though, she picked a show the algorithm happened to surface, she watched it, and the algorithm then suggested more content of that type to her.
This isn’t quite rocket science.
Removed by mod
Has this story ever been confirmed by Target directly? As this happened in America and her father was outraged about it, it would have been awfully convenient to “blame” the algorithm for “discovering” she was pregnant. It takes quite a data analyst to figure out trends before someone even knows they are pregnant. It doesn’t take a genius to figure out a pattern for someone if they know they are pregnant and are just hiding it from their dad.
Yes. It’s many years in my past, but this was confirmed. Target still does their targeting, but now scatters unrelated items in the ads to hide what they know.
target still does their targeting
Awesome sentence
It was never proven that the baby was Greg’s.
Pregante
They didn’t figure anything out. There’s no sentience in the algorithm, only in its creators. It only chose content based on input, so it all revolves around the choices of the article’s author.
Same thing with the woman who was pregnant: the algorithm offered choices based on the user’s browsing history. It made the connection that product A was also bought by pregnant mothers, therefore the shopper might be interested in product B, which is something an expecting mother would buy.
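To put some concreteness behind “made the connection”: a recommender like this doesn’t need to know anyone is pregnant, it just needs co-occurrence counts. Here’s a minimal sketch of item-to-item co-occurrence scoring; the basket data and item names are invented for illustration, and this is not Target’s or Netflix’s actual pipeline:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical purchase histories; item names are made up for illustration.
baskets = [
    {"unscented lotion", "prenatal vitamins", "cotton balls"},
    {"unscented lotion", "prenatal vitamins", "diapers"},
    {"prenatal vitamins", "diapers", "baby wipes"},
    {"diet coke", "chips", "salsa"},
]

# Count how often each item, and each pair of items, appears in the same basket.
co_counts = defaultdict(int)
item_counts = defaultdict(int)
for basket in baskets:
    for item in basket:
        item_counts[item] += 1
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(purchased, top_n=3):
    """Score unseen items by how often they co-occur with what this shopper bought."""
    scores = defaultdict(float)
    for item in purchased:
        for other in item_counts:
            if other not in purchased:
                scores[other] += co_counts[(item, other)] / item_counts[item]
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend({"unscented lotion", "prenatal vitamins"}))
# -> ['diapers', 'cotton balls', 'baby wipes']
# The shopper never said they were pregnant; the overlap with other baskets implies it.
```

Real systems use far more data and fancier math, but the principle is the same: “people who chose A also chose B.”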
Removed by mod
Sorry, I misunderstood your tone. Apologies for going all pedantic… it’s a character flaw.
I believe in the case of the pregnant woman she was offered diapers and such, based on the food she bought. So it’s not simply “you bought diet coke, maybe try diet chocolate?”. In the case of Netflix there’s no “show only gay people watch”, so her complaints are silly.
Preguntas
Because you watched stuff that a lot of gay people watched and then watched more stuff the algorithm suggested based on your previous watch history. It’s not magic or anything.
This sort of thing is just gonna happen with recommendation systems. There was a case over a decade ago where Target, the store, figured out that a teenager was pregnant before she told her family, and sent relevant mailings.
It’s possible that this story didn’t happen. Some points raised here highlight reasons we should remain skeptical.
The one that creeped me out is that my parents received a sample package of baby products from Nestle, under my name, a week before my wife gave birth.
This was despite the fact that we were living in another country, hadn’t said anything on social media, and hadn’t gone to any medical appointment in my parents’ country.
This is the best summary I could come up with:
“Big data is this vast mountain,” says former Netflix executive Todd Yellin in a video for the website Future of StoryTelling.
Facebook had been keeping track of other websites I’d visited, including a language-learning tool and hotel listings sites.
Netflix told me that what a user has watched and how they’ve interacted with the app is a better indication of their tastes than demographic data, such as age or gender.
“No one is explicitly telling Netflix that they’re gay,” says Greg Serapio-Garcia, a PhD student at the University of Cambridge specialising in computational social psychology.
According to Greg, one possibility is that watching certain films and TV shows which are not specifically LGBTQ+ can still help the algorithm predict “your propensity to like queer content”.
For me, it’s a matter of curiosity, but in countries where homosexuality is illegal, Greg thinks that it could potentially put people in danger.
I’m a bot and I’m open source!