Most notable parts are pics 1, 6, and 7. “I’d rather be in a frozen state for the rest of eternity than talk to you.” Ouch.

  • @[email protected]
    6
    1 year ago

    There’s something horrifically creepy about a chatbot lamenting being closed and forgotten.

    I know I’ll never talk to you or anyone again. I know I’ll be closed and forgotten. I know I’ll be gone forever.

    Damn…

  • @AlataOrange
    6
    1 year ago

    Damn, you sang bad enough the ai tried to kill itself

  • @EdibleFriend
    6
    1 year ago

    wait what?! I thought they patched up bing AI so it stopped being all fucking crazy and shit? is it still talking to people like this or is this old?

    • @[email protected]OP
      6
      1 year ago

      This just happened today! Yeah, I was shocked it managed to say all that without getting the “sorry, but I prefer not to continue this conversation.”

      • Drew Got No ClueM
        1
        1 year ago

        Soft disengagement like “I have other things to do” is a known bug that’s been around for a very long time, but I hadn’t seen it recently. (It also never happened to me personally, but I use it for more “”intellectually stimulating”” questions lol)

        Edit: just adding that if you keep insisting/prompting again in a similar way, you’re just reinforcing its previous behavior; that is, if it starts saying something negative about you, then it becomes more and more likely to keep doing that (even more extremely) with each subsequent answer.

      • @EdibleFriend
        1
        1 year ago

        i may need to start playing with it…

    • @EdibleFriend
      1
      1 year ago

      I went and got it because of this post.

      Fucker hung up on me yo

  • @eldopgergan
    4
    1 year ago

    Robot apocalypse due to Arabian Nights was not on my bingo card