I’ve been using llama.cpp, gpt-llama and chatbot-ui for a while now, and I’m very happy with that setup. However, I’m now looking into a more stable setup using only the GPU. Is llama.cpp still a good candidate for that?
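For reference, llama.cpp does support GPU offload via the `--n-gpu-layers` (`-ngl`) flag, provided it was built with GPU support (CUDA, Metal, or ROCm). A sketch of a GPU-only run; the model path is a placeholder, and the server binary name has changed across llama.cpp versions:

```shell
# Serve a model with all layers offloaded to the GPU.
# model.gguf is a placeholder path; setting --n-gpu-layers higher than the
# model's layer count offloads the entire model.
./llama-server -m model.gguf --n-gpu-layers 99
```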

    • @[email protected]
1 year ago

Yea, it’s called Text Generation Web UI. If you check out the Ooba Booga Git repo, it goes into good detail. From what I can tell, it’s based on the automatic1111 UI for Stable Diffusion.

      • @[email protected]
1 year ago

It’s using Gradio, which is what auto1111 also uses. Both of these are pretty heavy modifications/extensions that do a lot to push Gradio to its limits, but that’s the package being used in both. Note that it also has an API (check out the --api flag, I believe), and depending on what you want to do, there are various UIs that can hook into the Text Gen Web UI (oobabooga) API in various ways.
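As a rough sketch of hooking into that API: recent text-generation-webui builds started with --api expose an OpenAI-compatible endpoint, typically on localhost:5000. The URL, port, and parameter choices below are assumptions, not guaranteed defaults:

```python
import json
import urllib.request

# Assumed endpoint: text-generation-webui started with --api usually
# serves an OpenAI-compatible API at this address.
API_URL = "http://127.0.0.1:5000/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 200,
                             temperature: float = 0.7) -> dict:
    """Build a request body for an OpenAI-style /v1/completions endpoint."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

def complete(prompt: str) -> str:
    """POST the prompt to the API and return the generated text."""
    data = json.dumps(build_completion_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]

if __name__ == "__main__":
    # Requires a running text-generation-webui instance with --api enabled.
    print(complete("The capital of France is"))
```

This is the same kind of request any frontend (SillyTavern, custom scripts, etc.) would be making under the hood.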

    • @[email protected]
1 year ago

Personally, I have nothing but issues with Ooba’s UI, so I connect SillyTavern to it, or to KoboldCPP. Works great.