I’ve just discovered OmniGPT, which seems to be a chat app where you can interact with different LLMs (Claude, GPT-4, Llama, Gemini, etc.) and costs $16/month (it was $7/month until a week ago 🤦‍♂️). I’ve read in a Reddit post that it uses the APIs of all the providers, which is something that can be done for free using a personal account (since the API limits seem to be high). Do you know of something like OmniGPT that can be self-hosted and uses the user’s own API keys?
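
To make concrete what such a frontend would do: it holds your personal API keys and translates one prompt into each provider’s request format. A minimal sketch, assuming illustrative endpoint URLs, model names, and environment-variable names (none of these are taken from OmniGPT):

```python
import os

# Illustrative provider table; URLs and model names are assumptions.
PROVIDERS = {
    "openai": {
        "url": "https://api.openai.com/v1/chat/completions",
        "model": "gpt-4",
        "key_env": "OPENAI_API_KEY",
    },
    "anthropic": {
        "url": "https://api.anthropic.com/v1/messages",
        "model": "claude-3-haiku-20240307",
        "key_env": "ANTHROPIC_API_KEY",
    },
}

def build_request(provider: str, prompt: str) -> dict:
    """Build the URL, headers, and JSON body for one provider.

    Nothing is sent here; a real frontend would pass the result to an
    HTTP client. The personal API key is read from the environment.
    """
    cfg = PROVIDERS[provider]
    key = os.environ.get(cfg["key_env"], "")
    headers = {"Content-Type": "application/json"}
    if provider == "anthropic":
        # Anthropic authenticates with an x-api-key header,
        # while OpenAI-style APIs use a Bearer token.
        headers["x-api-key"] = key
        body = {
            "model": cfg["model"],
            "max_tokens": 512,
            "messages": [{"role": "user", "content": prompt}],
        }
    else:
        headers["Authorization"] = f"Bearer {key}"
        body = {
            "model": cfg["model"],
            "messages": [{"role": "user", "content": prompt}],
        }
    return {"url": cfg["url"], "headers": headers, "json": body}
```

The point is that the server only needs your keys and a thin translation layer, not the models themselves.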

    • @peregusOP · 0 points · 9 months ago

      That seems to need the AI model to be local

      • Scrubbles · 12 points · 9 months ago (edited)

        Oh yes, maybe I misunderstood what you were asking. This is the server that will host the models and the API; it also has a nice interface.

        So by local I mean local to the server, you can run it somewhere else and not put the models on your local computer, but yes the server will need them.

        You can then use other apps to connect with it. That’s what I consider self hosting, hosting the whole thing soup to nuts

        • @peregusOP · -1 points · 9 months ago

          What I’m looking for is a frontend that uses GPT-4, Gemini, and other AI engines with their respective API keys.

          • paraphrand · 23 points · 9 months ago

            Yeah, using “self hosted” in your title is misleading.

            • @peregusOP · -10 points · 9 months ago (edited)

              But I will…self-host this service! And besides the title, I’ve written a post with a description of what I’m looking for.

              • Nyfure · 8 points · 9 months ago (edited)

                You want a frontend, not the “service” itself.
                By “service” I usually mean the main logic part of something; in this case, the LLM processing itself.
                That’s probably where the confusion is coming from here.

                • @peregusOP · -11 points · 9 months ago

                  It’s still self hosted! 🤷🏻‍♂️

  • @[email protected]
    link
    fedilink
    English
    69 months ago

    I believe LibreChat would achieve your goals, but you’d need a PC or server to host it on. It supports all the major APIs.

    Kobold Lite might work as well, and doesn’t need to be hosted locally, but I don’t think it supports Claude Haiku specifically, for unknown reasons.

    Additionally, the official Claude API workshop is pretty good on desktop, but it only supports Claude.