Summary: MDN's new "AI Explain" button on code blocks generates human-like text that may be correct by happenstance, or may contain convincing falsehoods. This is a strange decision for a technical ...
And people ask the stupid AI precisely because they don't know the subject, so they have no way to validate the answers. They're fed bad information and believe it's the truth.