@return2ozma to Mental Health • English • 3 months ago
People are using ChatGPT for therapy—but is it a good idea? (www.newsweek.com)
cross-posted to: aboringdystopia
No.
It is not.
Yeah, wotsisname’s Law of Headlines. If it ends in a question mark, the answer to the question is no.
Betteridge
Why not?
Because too frequently it gives plausible-sounding but completely unfounded statements.
Also it can go more darkly wrong, and all the extra checks and safeguards don’t always protect it.
Why is this different than talking to a human?
Because a human can understand the situation, and the person they’re talking to, and reply with wisdom, rather than just parroting what seems like what they heard before.
Therapy isn’t about what the therapist says
Some of it is, as I can personally attest. And well-dressed lies can certainly do a person much harm.