They support Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

  • @[email protected]
    15 hours ago

    Last time I tried using a local LLM (about a year ago) it generated only a couple of words per second and the answers were barely relevant. Also, I don’t see how a local LLM can fulfill the glorified search engine role that people use LLMs for.

    • @[email protected]
      15 hours ago

      They’re fast and high quality now. ChatGPT is the best, but local LLMs are great, even with 10 GB of VRAM.