• 𝕸𝖔𝖘𝖘
    18 days ago

    This is a really great use of an LLM! Seriously, great job! Once it’s fully self-hostable (including the LLM model), I will absolutely find it space on the home server. Maybe using Rupesh’s fastsdcpu as the model and generation backend could work. I don’t remember what his license is, though.

    Edit: added link.

    • Cr4yfishOP
      18 days ago

      Thanks! I’m already eyeing ollama for this.
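      For anyone wanting to try something like that locally, here is a minimal sketch of what pointing generation at a locally running Ollama instance could look like. Port 11434 and the /api/generate endpoint are Ollama defaults; the model name is just a placeholder, not necessarily what this project will use.

          import json
          import urllib.request

          # Ollama exposes a simple HTTP API on port 11434 by default.
          OLLAMA_URL = "http://localhost:11434/api/generate"

          def generate(prompt: str, model: str = "llama3") -> str:
              """Send one non-streaming generation request to a local Ollama server."""
              payload = json.dumps({
                  "model": model,    # placeholder: use whatever model has been pulled locally
                  "prompt": prompt,
                  "stream": False,   # ask for a single JSON object instead of a token stream
              }).encode("utf-8")
              req = urllib.request.Request(
                  OLLAMA_URL,
                  data=payload,
                  headers={"Content-Type": "application/json"},
              )
              with urllib.request.urlopen(req) as resp:
                  return json.loads(resp.read())["response"]

          if __name__ == "__main__":
              print(generate("Summarize why self-hosting the model matters, in one sentence."))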