@Phoenix3875 to Programmer [email protected] • 2 months ago
Work of pure human soul (and pure human sweat, and pure human tears)
@bi_tux • 2 months ago
You don't even need a supported GPU, I run Ollama on my RX 6700 XT
@[email protected] • 2 months ago
You don't even need a GPU, I can run Ollama with Open WebUI on my CPU with an 8B model fast af
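For anyone wanting to reproduce that CPU-only setup, here's a minimal sketch using the ollama Python client. The "llama3:8b" tag and the num_gpu option are assumptions about your local install, so swap in whatever model you actually pulled:

```python
# Minimal sketch of CPU-only inference through the ollama Python client
# (pip install ollama; assumes the Ollama server is already running).
import ollama

response = ollama.chat(
    model="llama3:8b",  # assumed tag; use whichever 8B model you pulled
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    options={"num_gpu": 0},  # offload 0 layers to the GPU, i.e. run fully on CPU
)
print(response["message"]["content"])
```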
@bi_tux • 2 months ago
I tried it with my CPU (with Llama 3.0 7B), but unfortunately it ran really slowly (I've got a Ryzen 5700X)
@tomjuggler • 2 months ago
I ran it on my dual-core Celeron and… just kidding. Try the mini Llama 1B. I'm in the same boat with a Ryzen 5000-something CPU
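If you do go the small-model route, a hedged sketch: "llama3.2:1b" is one roughly-1B tag on the Ollama registry, but check `ollama list` and the model library for what's actually available to you:

```python
# Sketch: pull and query a ~1B model, which is far friendlier to weak CPUs.
import ollama

ollama.pull("llama3.2:1b")  # downloads the model if it isn't local yet
reply = ollama.generate(model="llama3.2:1b", prompt="Say hi in five words.")
print(reply["response"])
```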
@[email protected] • 2 months ago
I have the same GPU, my friend. I was trying to say that you won't be able to run ROCm on some Radeon HD xy from 2008 :D