• 𝕸𝖔𝖘𝖘
    2 months ago

    This is a really great use of an LLM! Seriously, great job! Once it’s fully self-hostable (including the LLM itself), I will absolutely find space for it on the home server. Maybe using Rupeshs’ fastsdcpu as the model and generation backend could work. I don’t remember what its license is, though.

    Edit: added link.

    • Cr4yfishOP
      2 months ago

      Thanks! I’m already eyeing ollama for this.