About LM Studio
LM Studio is a desktop app for developing and experimenting with LLMs locally on your computer.
Key functionality

- Download and run local LLMs (llama.cpp and, on Apple Silicon, MLX models)
- Download model weights from right within the app
- Chat with your documents entirely offline ("RAG")
- A REST API for using your local models from your own apps and scripts
Head over to the Downloads page and download an installer for your operating system.
LM Studio is available for macOS, Windows, and Linux.
LM Studio generally supports Apple Silicon Macs, x64/ARM64 Windows PCs, and x64 Linux PCs.
Consult the System Requirements page for more detailed information.
LM Studio supports running LLMs on Mac, Windows, and Linux using llama.cpp. On Apple Silicon Macs, LM Studio also supports running LLMs using Apple's MLX.

To install or manage LM Runtimes, press ⌘ + Shift + R on Mac or Ctrl + Shift + R on Windows/Linux.
Run an LLM like Llama, Phi, or DeepSeek R1 on your computer

To run an LLM on your computer you first need to download the model weights.
You can do this right within LM Studio! See Download an LLM for guidance.
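Once you have downloaded one or more models and enabled the local server, you can confirm what is available programmatically. The sketch below is illustrative only: it assumes the server is listening on LM Studio's default port 1234 and queries the OpenAI-compatible GET /v1/models endpoint.

```python
# Illustrative sketch: list models exposed by a running LM Studio local server.
# Assumes the server is enabled in LM Studio and listening on the default port 1234.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:1234/v1/models") as response:
    models = json.load(response)

for entry in models.get("data", []):
    print(entry["id"])
```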
You can attach documents to your chat messages and interact with them entirely offline, also known as "RAG".
Read more about how to use this feature in the Chat with Documents guide.
LM Studio provides a REST API that you can use to interact with your local models from your own apps and scripts.
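For example, a chat completion request against the OpenAI-compatible endpoint might look like the sketch below. It assumes the local server is running on the default port 1234 and that the model identifier (a placeholder here) matches one you have downloaded.

```python
# Illustrative sketch: send a chat completion request to LM Studio's local server.
# Assumes the server is listening on the default port 1234; replace the model
# identifier with one returned by GET /v1/models on your machine.
import json
import urllib.request

payload = {
    "model": "your-model-identifier",  # placeholder
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "In one sentence, what is LM Studio?"},
    ],
    "temperature": 0.7,
}

request = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    reply = json.load(response)

print(reply["choices"][0]["message"]["content"])
```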
Join the LM Studio community on Discord to ask questions, share knowledge, and get help from other users and the LM Studio team.