It seemingly has little impact. I’ve attempted to use LLMs a couple of times to ask very specific technical questions (on this specific model, running this specific OS version, how do I do this very specific thing) to try to cut down on the amount of research I would have to do to find a solution. The answer every time has been wrong. Once it was close enough that I was able to figure out the rest myself, but “close enough” doesn’t seem worth bothering with most of the time.
When I search for things I always skip the AI summary at the top of the page.