(it was the free online version, using an MS account)
This AI doesn't know stuff, and then it proceeds to go "why do you think this is?" or "could you find other information, so that I can help you?". I'm not sure that conveys how bad it sounds. Maybe it's the whole exchange that has that condescending tone.
I've tried a lot of LLMs, and I've never felt so aggravated by the way one of them talks.
How did they program that stuff to make it so bad… did they try to give it a personality? And if MS rolls that out in all its W11 apps, I bet people will start destroying their computers out of frustration… maybe that's their plan all along.
It seems I cannot copy more than one exchange at a time!! Another shitty feature from MS. Here is one:
"That's interesting!"??? What? I don't care if it is or not. Stop pretending to be interested. Then it pretends to know something ("It could indeed be from Indian mythology") without saying anything new.
And the question at the end of every answer in that exchange grates on you very fast: "Does any part of this story resonate with a particular culture or tradition you're familiar with?"… You're supposed to provide answers, not questions. Plus, what's the point of that question? Just ask what details I could add to help Copilot find an answer, not this condescending "are you familiar with".
Those are little things, but after 5 exchanges in the same tone it was insufferable.
Any normal LLM would just state the facts:
I get this same pattern from ChatGPT-4o a lot. I sometimes find it a little off-putting as well.