Patients explaining they liked what they heard - not whether it was correct or relevant to the cause. There isn't even a pipeline for escalation, because AIs don't think.
Exactly. AI chatbots also cannot empathize, since they have no self-awareness.
But it can give the illusion of empathy, which is far more important.
You can’t say “Exactly” when you tl;dr’d and removed one of the most important parts of the article.
Your human summary was literally worse than AI 🤦
I’m getting downvoted, which makes me suspect people think I’m cheerleading for AI. I’m not. I’m sure it sucks compared to a therapist. I’m just saying that the tl;dr also sucked.