It should be good at conversation and creative writing; it'll be for worldbuilding.

Best if it's uncensored, as I prefer that over the filter kicking in when I least want it.

I'm fine with those roleplaying models as long as they can actually give me ideas and talk to me logically.

  • @Smokeydope
    21 days ago

    At some point you'll run out of VRAM on the GPU. You can trade speed for capacity: offload some of the model's layers to system RAM, which slows generation but frees VRAM for more context.
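A rough sketch of that trade-off. Every number here (layer size, KV-cache cost per 4k tokens, card capacity) is an illustrative assumption, not a measurement of any real model or GPU:

```python
# Back-of-the-envelope: how many transformer layers still fit on the GPU
# once the KV cache for the desired context length is reserved.
# All figures are illustrative assumptions, not benchmarks.

def layers_on_gpu(vram_gb, n_layers, layer_gb, kv_gb_per_4k, ctx_tokens):
    """Layers that fit in VRAM after reserving KV cache for ctx_tokens."""
    kv_gb = kv_gb_per_4k * (ctx_tokens / 4096)  # KV cache grows with context
    free_gb = vram_gb - kv_gb
    if free_gb <= 0:
        return 0  # context alone exhausts the card; everything offloads
    return min(n_layers, int(free_gb // layer_gb))

# Hypothetical quantized 13B-class model: 40 layers at ~0.2 GB each,
# ~1 GB of KV cache per 4k tokens, on a 12 GB card.
print(layers_on_gpu(12, 40, 0.2, 1.0, 4096))   # short context: all 40 layers fit
print(layers_on_gpu(12, 40, 0.2, 1.0, 32768))  # long context: some layers must offload
```

The point is the shape of the curve, not the numbers: as context grows, the KV cache eats into VRAM, and more layers spill to system RAM, which is what makes long-context worldbuilding sessions slower.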

    • @[email protected]
      11 days ago

      Yes, but if he's worldbuilding, a larger, slower model might just be an acceptable compromise.

      I was getting OOM errors doing speech-to-text on my 4070 Ti. I know (now) that I should have gone for the 3090 Ti. Such is life.