- cross-posted to:
- fuck_ai
- [email protected]
I’m usually the one saying “AI is already as good as it’s gonna get, for a long while.”
This article, in contrast, quotes the folks building the next generation of AI - and they’re saying the same thing.
Sorry if I sound like a broken record, but again, doesn’t that mean the analogy itself is flawed? If the goal remains the same but there’s close to no explanatory power, then even if we get pragmatically useful results (i.e. it “works” in some useful cases), it’s basically “just” inspiration - which is nice, but is more branding than anything else.