• z3rOR0ne
      9 months ago

      Nice. Thanks. I'll save this post in case I use ollama in the future. Right now I use a codellama model and a mythomax model, but I'm not running them through a localhost server; the output just goes straight to the terminal or LMStudio.
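
      For anyone curious what the localhost route looks like, here's a minimal sketch of querying ollama's local REST API from Python (assuming `ollama serve` is running on its default port 11434 and the model has already been pulled; the prompt is just a placeholder):

      ```python
      import json
      import urllib.request

      # Ask a locally served model for a completion via ollama's /api/generate.
      payload = json.dumps({
          "model": "codellama",  # any model you've pulled with `ollama pull`
          "prompt": "Write a Python function that reverses a string.",
          "stream": False,       # return one JSON object instead of a stream
      }).encode("utf-8")

      req = urllib.request.Request(
          "http://localhost:11434/api/generate",  # ollama's default endpoint
          data=payload,
          headers={"Content-Type": "application/json"},
      )

      with urllib.request.urlopen(req) as resp:
          print(json.loads(resp.read())["response"])
      ```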

      This looks interesting though. Thanks!