It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?

  • @[email protected]
    link
    fedilink
    English
    13
    edit-2
    9 months ago

    No. It can’t. It’s programmed to mimic, nothing more. It’s doing what its word-prediction training programs it to do. It follows no logic and doesn’t care about anything, including you.

    This is just more evidence of how easily people can be manipulated.

    • @kromem · 9 months ago

      It’s not ‘programmed’ at all.

        • @kromem · 9 months ago

          Pretty much. What’s programmed is the mechanism by which the model self-supervises the weighting of its neural network to model the training data.

          We have next to no idea how the resulting network in a modern language model actually works internally, and it certainly isn’t programmed.
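          The distinction above can be sketched with a toy example. This is not a neural network (a real LLM learns weights by gradient descent over billions of parameters), just a hypothetical bigram counter, but it shows the same self-supervised principle: the only thing a programmer writes is the training procedure; the predictions themselves come entirely from the data.

          ```python
          from collections import defaultdict

          def train(corpus):
              """Count next-word frequencies. The 'labels' are the text
              itself -- that is what makes the training self-supervised."""
              counts = defaultdict(lambda: defaultdict(int))
              words = corpus.split()
              for cur, nxt in zip(words, words[1:]):
                  counts[cur][nxt] += 1
              return counts

          def predict(counts, word):
              """Return the most frequent next word seen in training."""
              followers = counts.get(word)
              if not followers:
                  return None
              return max(followers, key=followers.get)

          # Nothing below was 'programmed' to say anything in particular;
          # the behavior falls out of whatever corpus we feed it.
          model = train("the cat sat on the mat the cat ran")
          print(predict(model, "the"))  # "cat" follows "the" more often than "mat"
          ```

          Swap in a different corpus and the same untouched code produces entirely different predictions, which is the sense in which the mechanism is programmed but the model is not.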