• @bi_tux
      link
      62 months ago

      you don’t even need a supported GPU; I run ollama on my RX 6700 XT
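
      For context, the usual community workaround for RDNA2 cards that ROCm doesn’t officially list (the RX 6700 XT reports as gfx1031) is to override the reported GPU target so ROCm treats it as the supported gfx1030. A minimal sketch, assuming a stock ollama install on Linux; this is a well-known unofficial trick, not an AMD-supported configuration:

      ```shell
      # Community workaround (not official AMD support): make ROCm treat a
      # gfx1031 card like the RX 6700 XT as the supported gfx1030 target.
      export HSA_OVERRIDE_GFX_VERSION=10.3.0

      # Then start the ollama server; it should pick up the GPU via ROCm.
      ollama serve
      ```

      If ollama still falls back to CPU, its server log usually says which compute targets it detected, which helps confirm whether the override took effect.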

        • @bi_tux
          link
          22 months ago

          I tried it on my CPU (with Llama 3 7B), but unfortunately it ran really slowly (I have a Ryzen 5700X)

          • @tomjuggler
            link
            22 months ago

            I ran it on my dual-core Celeron and… just kidding, try the mini Llama 1B. I’m in the same boat with a Ryzen 5000-something CPU
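
            For CPU-only boxes, pulling one of the small models is the practical route. A sketch, assuming a stock ollama install (llama3.2:1b is the tag for the ~1B-parameter Llama 3.2 model in the ollama library):

            ```shell
            # A ~1B-parameter model is far more usable on CPU than a 7-8B one.
            # Downloads the model on first run, then answers the prompt.
            ollama run llama3.2:1b "Say hello in one sentence."
            ```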

      • @[email protected]
        link
        fedilink
        22 months ago

        I have the same GPU, my friend. I was trying to say that you won’t be able to run ROCm on some Radeon HD xy from 2008 :D