LughM to [email protected] • English • 7 months ago
Evidence is growing that LLMs will never be the route to AGI. They are consuming exponentially increasing energy, to deliver only linear improvements in performance.
arxiv.org • 57 comments
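The headline claim can be made concrete with a back-of-the-envelope sketch (an illustrative assumption for this thread, not a result from the linked paper): if performance grows roughly logarithmically with the energy spent, then each equal step in performance costs exponentially more energy.

```latex
% Illustrative scaling sketch (assumption, not from the linked paper):
% suppose performance P grows logarithmically with training energy E.
\[
  P(E) = a \log E + b
  \quad\Longrightarrow\quad
  E(P) = e^{(P - b)/a},
\]
% so each fixed gain \Delta P multiplies the required energy by a constant factor:
\[
  \frac{E(P + \Delta P)}{E(P)} = e^{\Delta P / a}.
\]
```

Under that assumed relationship, "linear improvement" in P implies multiplicative (i.e., exponential) growth in E, which is the shape of the trade-off the post title describes.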
@General_Effort • English • 7 months ago
> In theory there's an inflection point at which models become sophisticated enough that they can self-sustain with generating training data to recursively improve.

That sounds surprising. Do you have a source?