I’m looking for a resource-efficient AI model for text generation (math, coding, etc.) that will work with LocalAI. Which model should I use? I don’t want it to use more than 1–3 GB RAM. I’ll run it on a VPS to use with Nextcloud.

Edit: I’m using Mistral AI and Groq.com instead of self-hosting the models. They both have generous free plans.
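Since both providers expose OpenAI-compatible endpoints, switching from self-hosted LocalAI to a hosted API is mostly a matter of pointing the client at a different base URL. A minimal sketch, assuming Groq’s `/openai/v1/chat/completions` path and a placeholder model name (verify both against the provider’s current docs; `build_request` is just a hypothetical helper):

```python
import json
import os
import urllib.request

# Assumption: Groq's OpenAI-compatible base URL; check provider docs.
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(prompt, model="llama3-8b-8192"):
    """Build an OpenAI-style chat-completion payload (model name is a placeholder)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("Write a Python one-liner that sums a list.")
print(json.dumps(payload, indent=2))

# Actually sending the request needs an API key, so it is skipped
# when GROQ_API_KEY is not set and the sketch still runs offline.
if os.environ.get("GROQ_API_KEY"):
    req = urllib.request.Request(
        GROQ_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['GROQ_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

The same payload shape works against Mistral AI’s endpoint by swapping the URL and model name, which is what makes the hosted route a drop-in replacement.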

  • David From Space
    630 days ago

    try pfizer/poppy-lrud-normal-128, run it straight off your neural chip and feed it 1 GB RAM and you’ll be gud2go