@[email protected] to [email protected] • 2 days ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
minus-square@[email protected]linkfedilinkDeutsch3•2 days agoI did try to use it on Fedora but i have a Radeon 6700 XT and it only worked in the CPU. I wait until ROCM official support reaches my older Model.
minus-square@[email protected]linkfedilink4•2 days agoI have the same setup, you have to add the line Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0" for that specific GPU to the ollama.service file
minus-square@[email protected]linkfedilink3•2 days agoollam runs on the 6700 XT, but you need to add an environment variable for it to work… I just don’t remember what it was and am away from my computer right now