@[email protected] to [email protected] • 2 days ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
The Hobbyist • 2 days ago
Alternatively, you don’t even need podman or any containers, as open-webui can be installed simply using python/conda/pip, if you only care about serving yourself: https://docs.openwebui.com/getting-started/quick-start/ Much easier to run and maintain IMO. Works wonderfully.
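For anyone wanting to try that route, the linked quick start boils down to roughly the following (a minimal sketch; the venv path is just an example, and the docs recommend Python 3.11):

```bash
# Create an isolated environment so open-webui's dependencies
# don't touch the system Python (venv location is arbitrary).
python3 -m venv ~/open-webui-env
source ~/open-webui-env/bin/activate

# Install and start the web UI; it listens on port 8080 by default.
pip install open-webui
open-webui serve
```

Then open http://localhost:8080 in a browser; assuming Ollama is already running on its default port, Open WebUI should pick it up.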
@[email protected] • 2 days ago
And llamafile is a binary you can just download and run, no installation required. “Uninstallation” is deleting the file.
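The whole workflow is roughly this (the model file is the example from the llamafile README; any .llamafile works the same way):

```bash
# Download a llamafile -- a single file bundling the model weights
# and an executable that serves a local chat UI.
wget https://huggingface.co/Mozilla/llava-v1.5-7b-llamafile/resolve/main/llava-v1.5-7b-q4.llamafile

# Mark it executable and run it; it starts a web UI on localhost.
chmod +x llava-v1.5-7b-q4.llamafile
./llava-v1.5-7b-q4.llamafile

# "Uninstall":
rm llava-v1.5-7b-q4.llamafile
```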