@[email protected] to [email protected] • 2 days ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org) • 19 comments
@[email protected] • 10 points • edited 2 days ago
I tried it briefly, but it's hot garbage if you don't have potent hardware. The number of iterations you have to go through to get proper answers, and the time it takes to produce them, make it a waste of time.