• @Widowmaker_Best_Girl
    1 year ago

    Well, I was happily paying them to lewd up the chatbots, but then they emailed me telling me to stop. I guess they don’t want my money.

    • @[email protected]
      1 year ago

      I’ve heard it’s not as good, but I think NovelAI works pretty well, and explicitly allows that.

      • @Widowmaker_Best_Girl
        1 year ago

        I actually started my journey into lewd AI stuff with NovelAI. I stopped using it after a while because I like chatbot RP specifically, not just something that will finish a story for me. Using Silly Tavern to try to make the NovelAI models act like chatbots just shows how bad they are at that.

        • @[email protected]
          1 year ago

          Makes sense. In that case I guess your next best option is probably to buy or rent hardware to run the local models that are suitable for chat rp.

          • @Widowmaker_Best_Girl
            1 year ago

            I have definitely been considering it. My current hardware gives me about an 80-second delay when I run an LLM locally.

            • @[email protected]
              1 year ago

              Same, at least for anything but the tiny ones that will fit in my limited VRAM. Hoping a GPU that's actually good for LLMs will come out in the next few years that's not $15k and made for servers.
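For anyone wondering what "fits in my limited VRAM" works out to in numbers, here is a rough back-of-the-envelope sketch in Python. The 20% overhead factor for KV cache and activations is my own assumption, not a measured figure; real usage varies with context length and runtime.

```python
def vram_estimate_gb(n_params: float, bits_per_weight: float,
                     overhead: float = 1.2) -> float:
    """Approximate VRAM (GB) needed to hold a model's weights.

    overhead adds ~20% headroom for KV cache and activations --
    a rough guess, not a measured figure.
    """
    bytes_per_weight = bits_per_weight / 8
    return n_params * bytes_per_weight * overhead / 1e9

# A 7B-parameter model quantized to 4 bits per weight: ~4.2 GB,
# so it squeezes onto an 8 GB consumer card.
print(f"7B @ 4-bit: {vram_estimate_gb(7e9, 4):.1f} GB")

# The same model at fp16 (16 bits per weight): ~16.8 GB,
# which already needs a 24 GB card.
print(f"7B @ fp16:  {vram_estimate_gb(7e9, 16):.1f} GB")
```

This is why only heavily quantized small models run comfortably on typical gaming GPUs, while anything larger spills into system RAM and produces the long delays described above.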