- cross-posted to:
  - technology
  - [email protected]
By June, “for reasons that are not clear,” ChatGPT stopped showing its step-by-step reasoning.
This has already been disproven: the method the researchers used to measure the model's performance was flawed from the start. Here is a good Twitter thread explaining why: https://twitter.com/svpino/status/1682051132212781056
TL;DR: The researchers tested the models only on prime numbers, asking whether each was prime. They never mixed in composite numbers, so the test couldn't distinguish a model that genuinely checks primality from one that simply answers "prime" every time. When you run both the early and current versions of GPT-4 on a balanced mix of primes and composites, they turn out to be equally bad at identifying primes, with effectively no difference between the versions.
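
To make the flaw concrete, here is a minimal sketch (my own illustration, not the researchers' actual test harness; `always_says_prime` is a hypothetical stand-in for a model with a constant answer bias). A classifier that always answers "prime" scores 100% on an all-prime test set but only ~50% on a balanced one:

```python
import random

def is_prime(n: int) -> bool:
    """Trial-division primality check (ground truth)."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def always_says_prime(n: int) -> bool:
    """Stand-in for a model that answers 'prime' regardless of input."""
    return True

def accuracy(model, numbers) -> float:
    """Fraction of numbers where the model's answer matches ground truth."""
    return sum(model(n) == is_prime(n) for n in numbers) / len(numbers)

primes = [n for n in range(2, 20000) if is_prime(n)][:500]
composites = [2 * random.randint(500, 5000) for _ in range(500)]  # even numbers > 2

print(f"all-prime test set: {accuracy(always_says_prime, primes):.2f}")               # 1.00
print(f"balanced test set:  {accuracy(always_says_prime, primes + composites):.2f}")  # ~0.50
```

A model that always answers "not prime" would show the mirror image: 0% on the all-prime set but the same ~50% on the balanced one. That is why a shift in answer bias alone can look like a dramatic capability change on a one-sided test set.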