Study Reveals Gender Bias in ChatGPT Translations — researchinenglish.com
Posted by @[email protected] to News (English) · 10 months ago · 26 comments
knightly the Sneptaur · 9 months ago:
So, what? You think women need their own LLMs or something? You go ahead and get started on that; the rest of us can work on making the existing ones less sexist.
deleted by creator
knightly the Sneptaur · 9 months ago:
They don’t need sentience to be sexist. Algorithmic sexism comes from the people writing the algorithms.
deleted by creator
deleted by creator
@[email protected] · 9 months ago (edited):
Programmers don’t program sexism into machine learning models. What happens is that people who may or may not be programmers provide them with biased training data, because getting unbiased data is really, really hard.
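The mechanism described above can be shown with a toy sketch. Nothing here is from the study or from any real model: the corpus, the profession/pronoun pairs, and the `pick_pronoun` helper are all invented for illustration. The point is that the code itself is gender-neutral; the skew in its output comes entirely from the frequencies in the training data it is given.

```python
from collections import Counter

# Hypothetical biased training corpus: (profession, pronoun) co-occurrence
# pairs as a model might see them. The skew (doctor->he, nurse->she) is
# invented here to mirror the kind of imbalance real corpora contain.
corpus = [
    ("doctor", "he"), ("doctor", "he"), ("doctor", "she"),
    ("nurse", "she"), ("nurse", "she"), ("nurse", "he"),
]

def pick_pronoun(profession, corpus):
    """Choose the pronoun most often paired with this profession in training.

    The rule is symmetric and contains no notion of gender; it only
    reflects whatever counts the corpus happens to have.
    """
    counts = Counter(pron for prof, pron in corpus if prof == profession)
    return counts.most_common(1)[0][0]

# Translating from a language with gender-neutral pronouns forces a choice,
# and the choice falls out of the corpus counts, not the algorithm:
print(pick_pronoun("doctor", corpus))  # "he"
print(pick_pronoun("nurse", corpus))   # "she"
```

Swap the counts in `corpus` and the same unchanged function produces the opposite pronouns, which is the commenter's point: debiasing the data changes the behavior without touching a line of the algorithm.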
This is a nothing argument.
They’re nuts. Easy block, IMO.