Hey everyone!

I don’t think I’ve shared this one before, so allow me to introduce you to ‘LM Studio’ - a new application that is tailored to LLM developers and enthusiasts.

Check it out!


With LM Studio, you can …

🤖 - Run LLMs on your laptop, entirely offline

👾 - Use models through the in-app Chat UI or an OpenAI compatible local server

📂 - Download any compatible model files from HuggingFace 🤗 repositories

🔭 - Discover new & noteworthy LLMs on the app’s home page

LM Studio supports any ggml Llama, MPT, and StarCoder model on Hugging Face (Llama 2, Orca, Vicuna, Nous Hermes, WizardCoder, MPT, etc.)
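Since the local server speaks the OpenAI-compatible wire format, any OpenAI-style client can talk to it. Here's a minimal sketch in Python — the port (1234), the model name, and the endpoint path are assumptions based on the standard OpenAI chat-completions format, so check the app's server tab for your actual settings:

```python
import json

# Assumed default address for LM Studio's local server (verify in-app).
BASE_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "local-model",
                       temperature: float = 0.7) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,  # placeholder name; the server uses whatever model is loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

body = build_chat_request("Say hello in five words.")
print(json.dumps(body, indent=2))

# To actually send it (requires the server running in LM Studio):
#   import urllib.request
#   req = urllib.request.Request(BASE_URL, json.dumps(body).encode(),
#                                {"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
```

Because the request/response shape matches OpenAI's, existing tooling can usually be pointed at the local server just by swapping the base URL.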

Minimum requirements: M1/M2 Mac, or a Windows PC with a processor that supports AVX2. Linux support is under development.

Made possible thanks to the llama.cpp project.

We are expanding our team. See our careers page.


Love seeing these new tools come out, especially now that the new GGUF format is being widely adopted.

The regularly updated, curated list of new LLM releases they provide through the app is reason enough for me to keep it installed.

I’ll be tinkering plenty when I have the time this week. I’ll be sure to let everyone know how it goes! In the meantime, if you do end up giving LM Studio a try - let us know your thoughts and experience with it in the comments below.

  • @[email protected]
    1 year ago

    I’ve been using LM Studio for a while now, and it’s really come along, with some great features such as per-chat configs. Performance is solid even on a base 14" M1 Pro MacBook Pro.