• @BreadstickNinja
    2 days ago

    It likely is hard-coded against that, and in this case it also never said anything like that outright.

    Did you read the article with the conversation? The teen said he wanted to “come home” to Daenerys Targaryen and she (the AI) replied “please do, my sweet king.”

    It’s setting an absurdly high bar to expect an AI to understand euphemism and subtext as potential indicators of self-harm. That’s the job of a psychiatrist, a real-world professional the kid’s parents should have taken him to.