I'm using Ollama on my server with the WebUI. It has no GPU, so it's not quick to reply, but not too slow either.

I'm thinking about removing the VM as I just don't use it. Are there any good uses or integrations with other apps that might convince me to keep it?
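
For context, the kind of integration I mean is anything that talks to Ollama's REST API. A minimal sketch of what that looks like (untested, assuming the default localhost:11434 endpoint and a pulled "mistral" model):

```python
# Minimal sketch: ask a local Ollama instance a question over its REST API.
# Assumes Ollama's default address (localhost:11434) and that a model
# named "mistral" has already been pulled.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",
        "prompt": "Summarise today's server logs in one sentence.",
        "stream": False,  # wait for the full answer instead of streaming
    },
    timeout=300,  # CPU-only inference can be slow
)
resp.raise_for_status()
print(resp.json()["response"])
```

Any app that can send an HTTP request can hook in the same way.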

  • minnix · -2 points · 2 months ago

    Ollama without a GPU is pretty useless unless you're running it on Apple silicon. I'd just get rid of it until you get a GPU.

    • Possibly linux · 1 point · 2 months ago

      I have never tested it on Apple silicon, but it works fine on my laptop.

          • minnix · 2 points · edited · 2 months ago

            The CPU is only one factor regarding specs, and a small one at that; RAM capacity and memory bandwidth matter more for LLM inference. What kind of tokens-per-second (t/s) performance are you getting with a standard 13B model? You can measure it as in the sketch below.
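
            Ollama returns token counts and timings with each API response, so you can compute t/s directly. A rough sketch (assuming the default local endpoint; swap in whatever model you're testing):

            ```python
            # Rough sketch: compute generation speed (tokens/s) from Ollama's
            # own timing metadata. Assumes the default localhost endpoint and
            # a pulled "mistral" model.
            import requests

            resp = requests.post(
                "http://localhost:11434/api/generate",
                json={
                    "model": "mistral",
                    "prompt": "Write a haiku about servers.",
                    "stream": False,
                },
                timeout=600,
            ).json()

            # eval_count is the number of generated tokens;
            # eval_duration is the generation time in nanoseconds.
            tps = resp["eval_count"] / resp["eval_duration"] * 1e9
            print(f"{tps:.1f} tokens/s")
            ```

            Alternatively, `ollama run <model> --verbose` prints the same eval rate in the terminal after each reply.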

            • Possibly linux · 1 point · 2 months ago

              I don't have enough RAM to run a 13B model. I just stick to Mistral 7B and it works fine.