• @[email protected] (OP) · 1 year ago

    lollms-webui is the jankiest of the images, but that one’s new to the scene and I’m working with the dev a bit to get it nicer (the main current problem is the requirement for interactive CLI prompts, which he’ll be removing). Koboldcpp and text-gen are in a good place though; I’m happy with how those are running.

  • ffhein · 1 year ago

    Awesome work! Going to try out koboldcpp right away. Currently running llama.cpp in Docker on my workstation because it would be such a mess to get the CUDA toolkit installed natively…

    Out of curiosity, isn’t conda a bit redundant in docker since it already is an isolated environment?

    • @[email protected] (OP) · 1 year ago

      Yes, that’s a good one for an FAQ because I get it a lot, and it’s a very good question haha. The reason I use it is image size: the base nvidia devel image is needed for a lot of compilation during Python package installation and is huge, so instead I build everything in conda and transfer it to the nvidia runtime image, which is… also pretty big, but it saves several GB of space, so it’s a worthwhile hack :)

      But yes, avoiding CUDA messes on my bare machine is definitely my biggest motivation.
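The conda trick described above amounts to a multi-stage build: compile in the devel image, then copy only the finished environment into the runtime image. A rough sketch, where the CUDA image tags, env name, and installed package are illustrative and not taken from the actual project:

```dockerfile
# Stage 1: build in the large devel image, which carries nvcc and the CUDA
# headers needed to compile Python packages with GPU extensions.
FROM nvidia/cuda:11.8.0-devel-ubuntu22.04 AS builder

# Install Miniconda and create an isolated env holding the heavy packages.
RUN apt-get update && apt-get install -y wget && \
    wget -qO /tmp/miniconda.sh \
      https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh && \
    bash /tmp/miniconda.sh -b -p /opt/conda
RUN /opt/conda/bin/conda create -y -n app python=3.10 && \
    /opt/conda/envs/app/bin/pip install llama-cpp-python

# Stage 2: start from the smaller runtime image and copy only the finished
# conda env across, leaving the compilers and build deps behind.
FROM nvidia/cuda:11.8.0-runtime-ubuntu22.04
COPY --from=builder /opt/conda /opt/conda
ENV PATH=/opt/conda/envs/app/bin:$PATH
CMD ["python", "app.py"]
```

The space saving comes from the final image never containing the devel layers at all; only the `/opt/conda` directory crosses the stage boundary.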

      • ffhein · 1 year ago

        Ah, nice.

        Btw, perhaps you’d like to add:

        build: .

        to docker-compose.yml so you can just write “docker-compose build” instead of having to do it with a separate docker command. I’d submit a PR for it, but I’ve made a bunch of other changes to that file, so it’s probably faster if you do it.
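For reference, the suggested change looks roughly like this in context — the service name, image tag, and port are illustrative, not taken from the project’s actual compose file:

```yaml
services:
  koboldcpp:
    build: .                  # enables `docker-compose build` directly
    image: koboldcpp:latest   # tag used when the image is built
    ports:
      - "5001:5001"
```

With `build: .` present, `docker-compose build` (or `docker-compose up --build`) rebuilds the image from the Dockerfile in the current directory, with no separate `docker build` step.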

  • @[email protected] · 1 year ago

    I would love to have some GUI with optional vector database support that I could feed my docs into.

    • @Falcon · 11 months ago

        You want H2OGPT, or just use LangChain from the CLI.
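Neither tool’s actual API is shown here, but the underlying pattern both implement — embed your documents into a vector store, then retrieve the most similar ones for a query — can be illustrated with a toy sketch. The bag-of-words “embedding” below is a deliberate stand-in; a real setup would use an embedding model and a proper vector database such as Chroma or FAISS:

```python
import math
from collections import Counter

def embed(text):
    """Toy stand-in embedding: a bag-of-words frequency vector.
    A real pipeline would call an embedding model here instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyVectorStore:
    """Minimal in-memory vector store: add documents, query by similarity."""
    def __init__(self):
        self.docs = []  # list of (text, embedding) pairs

    def add(self, text):
        self.docs.append((text, embed(text)))

    def query(self, question, k=1):
        q = embed(question)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

store = ToyVectorStore()
store.add("koboldcpp exposes an OpenAI-compatible API on port 5001")
store.add("the compose file mounts ./models into the container")
print(store.query("which port does the API use?")[0])
```

A GUI with vector-database support is doing essentially this on ingest and on every question, just with learned embeddings instead of word counts.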