• @froh42
    16 hours ago

    Don’t you need a fast GPU to do so?

    • @[email protected]
      3 hours ago

      You would benefit from some GPU offloading; it considerably speeds up the answers. But at the bare minimum, you only need enough RAM to load the model.
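
      As a rough back-of-the-envelope sketch of that "enough RAM" point (the helper function, overhead factor, and example sizes here are illustrative assumptions, not exact figures for any specific model):

      ```python
      def model_ram_gb(n_params_billion, bits_per_weight, overhead=1.2):
          """Rough RAM estimate: quantized weights plus ~20% for cache/buffers."""
          weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
          return weight_bytes * overhead / 1e9

      # e.g. a 7B-parameter model at 4-bit quantization:
      print(f"{model_ram_gb(7, 4):.1f} GB")  # roughly 4.2 GB
      ```

      So a modest desktop with 8–16 GB of RAM can often load a small quantized model on CPU alone; the GPU just makes inference faster.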