Researchers say that the model behind the chatbot fabricated a convincing bogus database, but a forensic examination shows it doesn’t pass for authentic.
I don’t understand why this is surprising or even unexpected. LLMs are not intelligent. They don’t actually “know” anything. They are statistical models trained on which words, in which order, tend to satisfy people. Of course they’re going to make things up: they have no concept of what is real and what is invented, because those concepts don’t mean anything to them. They put words into plausibly grammatical sentences. That’s it.
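The “words in what order” idea above can be sketched with a toy model. This is a deliberately crude bigram sampler, not how real LLMs work (those are neural networks over tokens), but it shows the core loop: predict the next word from what came before, with no notion of truth. The tiny corpus here is made up for illustration.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; real models train on billions of tokens.
corpus = (
    "the study shows the data supports the claim "
    "the data shows the study supports the data"
).split()

# Count which word follows which.
followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def generate(start, length=6):
    """Greedily emit the most common follower of the previous word."""
    out = [start]
    for _ in range(length):
        options = followers.get(out[-1])
        if not options:
            break
        out.append(options.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))
```

Nothing in that loop can distinguish a true sentence from a fabricated one; it only knows which words tend to follow which.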
I’m already so tired of hearing about these things. If LLMs actually had amazing capabilities, or could change the world, they wouldn’t be sold to the public. This is all just a marketing blitz for something that will probably end up like cryptocurrency: a niche technology that does one specific thing very well but otherwise isn’t generally useful.
I think the issue is that peer reviewers at academic journals are just regular researchers at regular institutions, volunteers (or voluntold). They don’t run forensic examinations of the raw datasets that cross their desks: they aren’t paid to review in the first place, and forensic data examination is a specialized skill anyway. So if the bullshit engines known as LLMs are convincing enough to generate supporting data that passes a peer reviewer’s smell test, that’s going to be a big problem for the whole publishing process worldwide.
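For a sense of what even basic automated screening could look like, here is a hedged sketch of two classic fabrication checks: exact-duplicate records, and skewed terminal digits (trailing digits of genuine measurements are roughly uniform, so a strong skew is a red flag). The data here is simulated; real forensic analysis, like the one in the article, goes much deeper.

```python
import random
from collections import Counter

def duplicate_row_fraction(rows):
    """Fraction of rows that exactly duplicate an earlier row.
    Fabricated datasets often contain copy-pasted records."""
    seen, dupes = set(), 0
    for row in rows:
        key = tuple(row)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(rows) if rows else 0.0

def terminal_digit_skew(values):
    """Max absolute deviation from the uniform 10% share across
    trailing digits 0-9. Large skew suggests invented numbers."""
    digits = Counter(int(str(abs(v))[-1]) for v in values)
    n = len(values)
    return max(abs(digits.get(d, 0) / n - 0.1) for d in range(10))

random.seed(0)
# Simulated "too clean" measurements ending only in 0 or 5,
# versus genuinely varied ones.
fake = [random.choice(range(100, 200, 5)) for _ in range(1000)]
real = [random.randrange(100, 200) for _ in range(1000)]

print(terminal_digit_skew(fake))  # large skew
print(terminal_digit_skew(real))  # near uniform
print(duplicate_row_fraction([[1, 2], [1, 2], [3, 4]]))
```

Checks like these are cheap to run, which is rather the point: the bottleneck isn’t technique, it’s that nobody in the current peer-review pipeline is paid to run them.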
I would say that academic journals will have to hire data scientists to verify that datasets are genuine before they’re sent to reviewers, but that would require actually spending money to do the work instead of doing nothing while extracting billions of dollars in free money from researchers, so it’s never going to happen.