Neat, but isn’t it disingenuous to describe an LLM as a “personal AI” if it cannot be run locally?
If our future is going to rely on LLMs, which it obviously will, we’re set up for a real dystopia unless LLMs can run locally without giving our whole lives to the corporation behind the chatbot.
Does the AI community have any sense of when a local LLM with GPT-3.5-class performance will be possible?