• Possibly linux
    7 months ago

    That’s not how that works. Also, we have our own. It’s called Ollama.

      • Possibly linux
        7 months ago

        On which platform?

        Basically you need three things: the Ollama software, an LLM such as Mistral, and a front end like Open WebUI.

        Ollama is pretty much just a daemon that exposes a web API apps can use to query LLMs.
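
        As a quick sanity check, you can hit that API directly with curl (this assumes the daemon is on its default port 11434 and that a model named mistral has already been pulled):

        ```shell
        # Ask the local Ollama daemon for a one-shot (non-streamed) completion.
        # 11434 is Ollama's default listen port.
        curl http://127.0.0.1:11434/api/generate -d '{
          "model": "mistral",
          "prompt": "Why is the sky blue?",
          "stream": false
        }'
        ```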

        • @[email protected]
          7 months ago

          Linux, specifically Nobara (a gaming-focused Fedora distro) for me.

          Do you have any guides you would recommend?

          • Possibly linux
            7 months ago

            Actually it is pretty easy. You can either run it in a VM or run it in Podman.

            For a VM, you could install virt-manager and then Debian. From there you of course need to do the normal setup: enable SSH and disable root login.
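
            For the SSH part, disabling root login is a one-line config change (a sketch assuming Debian’s default config path and service name):

            ```shell
            # On the Debian VM: forbid direct root login over SSH, then reload sshd.
            sudo sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin no/' /etc/ssh/sshd_config
            sudo systemctl restart ssh   # the sshd service is named "ssh" on Debian
            ```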

            Once you have a Debian VM you can install Ollama and pull down LLaVA and Mistral. Make sure you give the VM plenty of resources, including almost all your cores and 8 GB of RAM. To set up Ollama you can follow the guides.
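
            Concretely, the install-and-pull steps look like this (the install script URL is the one published on the official Ollama site):

            ```shell
            # Install Ollama via the official install script, then fetch the models.
            curl -fsSL https://ollama.com/install.sh | sh
            ollama pull mistral   # general-purpose text model
            ollama pull llava     # vision-capable model
            ```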

            Once you have Ollama working you can then set up Open WebUI. I had to use network: host with the Ollama environment variable pointed to 127.0.0.1 (loopback).
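
            That setup corresponds to something like the following (assuming the ghcr.io/open-webui/open-webui image and its OLLAMA_BASE_URL variable; adjust if your image or version differs):

            ```shell
            # Run Open WebUI on the host network so it can reach Ollama on loopback.
            podman run -d --name open-webui \
              --network=host \
              -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
              ghcr.io/open-webui/open-webui:main
            ```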

            Once that’s done you should be able to access it at the VM’s IP on port 8080. The first time it runs you need to click create account.

            Keep in mind that a blank screen means it can’t reach Ollama.

            The alternative to this would be Podman. You could theoretically create an Ollama container and an Open WebUI container; they would need to be attached to the same internal network. It would probably be simpler to run, but I haven’t tried it.
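
            A sketch of that untried Podman variant (the network and container names here are illustrative, and images are the upstream defaults):

            ```shell
            # Create a shared network and attach both containers to it, so
            # Open WebUI can reach Ollama by container name.
            podman network create llm-net
            podman run -d --name ollama --network llm-net \
              -v ollama:/root/.ollama \
              docker.io/ollama/ollama
            podman run -d --name open-webui --network llm-net \
              -p 8080:8080 \
              -e OLLAMA_BASE_URL=http://ollama:11434 \
              ghcr.io/open-webui/open-webui:main
            ```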