• Widowmaker_Best_Girl · 3 years ago

    I actually started my journey into lewd AI stuff with NovelAI. I stopped using it after a while because I like chatbot RP specifically, not just something that will finish a story for me. Using SillyTavern to try to make the NovelAI models act like chatbots just shows how bad they are at that.

    • chicken@lemmy.dbzer0.com · 3 years ago

      Makes sense. In that case I guess your next best option is probably to buy or rent hardware to run the local models that are suited to chat RP.

      • Widowmaker_Best_Girl · 3 years ago

        I have definitely been considering it. My current hardware gives me about an 80-second delay when I run an LLM locally.

        • chicken@lemmy.dbzer0.com · 3 years ago

          Same, at least for anything but the tiny ones that will fit in my limited VRAM. Hoping a GPU that's actually good for LLMs, and isn't $15k and made for servers, comes out in the next few years.