• @[email protected]
      1 • 1 day ago

      As someone who is rather new to the topic: I have a GPU with 16 GB VRAM and only recently installed Ollama. Which size of DeepSeek R1 should I use? 🤔

      • @kyoji
        2 • 19 hours ago

        I also have 16 GB VRAM and the 32B version runs OK. Anything larger would take too long, I think.
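
        If it helps, here's a rough sketch of calling that 32B distill from the `ollama` Python package. The package name and the `deepseek-r1:32b` tag are taken from the Ollama library; the size comment is an assumption based on the default quantization, so treat this as a sketch rather than a recipe:

        ```python
        # Rough sketch: uses the `ollama` Python package (pip install ollama)
        # and assumes the model was pulled first with `ollama pull deepseek-r1:32b`.
        import ollama

        response = ollama.chat(
            model="deepseek-r1:32b",  # roughly 20 GB at the default quantization, so on a 16 GB card
                                      # Ollama offloads part of it to the CPU, which is why it's slow-ish
            messages=[{"role": "user", "content": "In one sentence, what is a distilled model?"}],
        )
        print(response["message"]["content"])
        ```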

      • Lurker
        2 • 1 day ago

        You can try from the smallest and work up to the bigger ones. You can probably run the biggest too, but it will be slow. A rough comparison script is sketched below.
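
        Here's a sketch of that "smallest to biggest" approach using the `ollama` Python package. The tag names are assumptions taken from the Ollama model library page, and each pull can be a large download:

        ```python
        # Sketch only: times one short prompt against each DeepSeek R1 distill, smallest first.
        # Tags assumed from the Ollama model library; each pull can be several GB.
        import time
        import ollama

        TAGS = ["deepseek-r1:1.5b", "deepseek-r1:7b", "deepseek-r1:8b",
                "deepseek-r1:14b", "deepseek-r1:32b"]  # add "deepseek-r1:70b" if you have the RAM/disk for it

        for tag in TAGS:
            ollama.pull(tag)  # skip this and pull manually if bandwidth/disk space is a concern
            start = time.time()
            reply = ollama.chat(model=tag,
                                messages=[{"role": "user", "content": "Say hello in one sentence."}])
            print(f"{tag}: {time.time() - start:.1f}s  {reply['message']['content'][:60]!r}")
        ```

        The timing per tag gives a quick feel for which size is still comfortable on your hardware before you commit to the bigger ones.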