@[email protected] to [email protected] · English · 16 hours ago
Firefox introduces AI as experimental feature (lemmy.fish)
@[email protected] · 5 hours ago
Last time I tried using a local LLM (about a year ago), it generated only a couple of words per second and the answers were barely relevant. Also, I don't see how a local LLM can fill the glorified-search-engine role that people use LLMs for.
@[email protected] · English · 5 hours ago
They're fast and high quality now. ChatGPT is still the best, but local LLMs are great, even with 10 GB of VRAM.
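As a rough sanity check on the 10 GB figure, here is a back-of-the-envelope VRAM estimate for a quantized model. The numbers are illustrative assumptions, not from the thread: weights are taken to dominate memory use, with ~20% overhead budgeted for the KV cache and activations.

```python
# Rough VRAM estimate for loading a quantized local LLM.
# Assumptions (illustrative): weight memory dominates, plus ~20%
# overhead for KV cache and activations.

def vram_gb(params_billion: float, bits_per_weight: float,
            overhead: float = 0.2) -> float:
    """Approximate VRAM in GB needed to run a model."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 13B-parameter model quantized to 4 bits per weight comes out
# to roughly 7.8 GB, which is consistent with fitting on a 10 GB GPU.
print(round(vram_gb(13, 4), 1))
```

By the same arithmetic, the full-precision (16-bit) version of that model would need ~31 GB, which is why quantization is what makes consumer-GPU local inference practical.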