An Amazon chatbot that’s supposed to surface useful information from customer reviews of specific products will also recommend a variety of racist books, lie about working conditions at Amazon, and, when asked, write a cover letter for a job application with entirely made-up work experience, 404 Media has found.

  • @dumpsterlid, 9 months ago
    When political conservatives complain about “biased” LLMs, it’s mostly because the models won’t tell them that their harmful beliefs are correct.

    “What, because I think Islam is inherently a violent religion, now this chatbot is telling me I AM the one with violent and harmful beliefs???” - some loser, maybe elon musk or maybe your uncle, who cares.