In response to Bender pointing out that ChatGPT and its competitors simply encode relationships between words and have no concept of referent or meaning, which is a devastating critique of what the technology actually does, the absolute best response he can muster for his work is “yeah, but humans don’t do anything more complicated than that”. I mean, speak for yourself, Sam: the rest of us have some concept of semiotics, and we can do things like identify anagrams or count the number of letters in a word, which requires a level of recursivity that’s beyond what ChatGPT can muster.
Boom Shanka (emphasis added)
I don’t think those have anything to do with recursion (unless “recursivity” means something else?); they’re just a consequence of the token system ChatGPT uses. The model never sees individual letters, only opaque subword token IDs, so letter-level tasks like counting characters or spotting anagrams aren’t represented in its input at all.
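To make that concrete, here’s a minimal sketch using OpenAI’s tiktoken library (my choice of the cl100k_base encoding as a representative GPT-4-era tokenizer is an assumption, not something from the thread). The point is just that the model receives integer token IDs, with a word’s spelling smeared across one or more indivisible units:

```python
# A minimal sketch, assuming the open-source tiktoken library, of why
# letter-level tasks are hard for an LLM: the model never sees letters,
# only opaque subword token IDs.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # a GPT-4-era BPE tokenizer

word = "recursivity"
token_ids = enc.encode(word)

# Each token is an indivisible unit to the model; there is no per-letter
# structure for it to count over or rearrange.
for tid in token_ids:
    print(tid, repr(enc.decode([tid])))
```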
Anagrams are much more trainable for an LLM with a smaller token size (down to character-level tokens). The downside is that this increases sequence length, and with it the cost of training and inference.
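As a rough illustration of that trade-off (my own example, not anything from the thread): if tokens were single characters, an anagram check reduces to a trivial multiset comparison, but every word then costs as many positions as it has letters, and transformer attention cost grows quadratically with sequence length.

```python
# A hedged illustration of the character-token trade-off: letter identity
# becomes directly visible, but sequences get much longer.
def is_anagram(a: str, b: str) -> bool:
    # With character-level tokens, this is exactly the information
    # an anagram task needs: compare the two multisets of letters.
    return sorted(a) == sorted(b)

print(is_anagram("listen", "silent"))  # True

# The cost side: 19 character tokens for a phrase that a subword
# tokenizer would likely cover in roughly 4 tokens.
print(len("the quick brown fox"))  # 19
```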