From f3ab727ec2178ef658242a8623e525f8bef94c2c Mon Sep 17 00:00:00 2001
From: Preetham
Date: Fri, 3 Apr 2026 10:15:57 +0530
Subject: [PATCH] Added LM Studio provider setup guide (#227)

* ## PR: Add LM Studio Provider Support

### Summary

Adds comprehensive LM Studio integration to openclaude, following the same pattern as the existing Ollama provider. LM Studio is a popular local LLM inference tool that exposes an OpenAI-compatible API.

### Changes (4 files, 672 insertions)

**New Files:**

- `lmstudio_provider.py` (377 lines) - Full provider implementation with:
  - Health check functions (`check_lmstudio_running`)
  - Model listing (`list_lmstudio_models`)
  - Chat completion (`lmstudio_chat`)
  - Streaming support (`lmstudio_chat_stream`)
  - Comprehensive docstring with setup instructions, troubleshooting, and model recommendations
- `test_lmstudio_provider.py` (227 lines) - Complete test suite with 12 passing tests covering:
  - API URL construction
  - Server health checks
  - Model listing
  - Chat completion functionality

**Modified Files:**

- `docs/quick-start-mac-linux.md` (+34 lines) - Added Option D: LM Studio with setup instructions and troubleshooting
- `docs/quick-start-windows.md` (+34 lines) - Added Option D: LM Studio with PowerShell syntax and troubleshooting

### Key Features

- No API key required (local inference)
- Default port: 1234 (LM Studio's standard)
- OpenAI-compatible API integration
- Consistent with existing provider patterns (Ollama, Atomic Chat)
- All tests passing (12/12)

### Usage

```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_MODEL=your-model-name

openclaude
```

* Made the PR doc-only for LM Studio

* Fixed the docs for recent LM Studio UI changes
---
 docs/quick-start-mac-linux.md | 35 +++++++++++++++++++++++++++++++++++
 docs/quick-start-windows.md   | 35 +++++++++++++++++++++++++++++++++++
 2 files changed, 70 insertions(+)

diff --git a/docs/quick-start-mac-linux.md b/docs/quick-start-mac-linux.md
index 7e8cb96e..133c713a 100644
--- a/docs/quick-start-mac-linux.md
+++ b/docs/quick-start-mac-linux.md
@@ -66,6 +66,33 @@ openclaude
 
 No API key is needed for Ollama local models.
 
+### Option D: LM Studio
+
+Install LM Studio first from:
+
+- `https://lmstudio.ai/`
+
+Then in LM Studio:
+
+1. Download a model (e.g., Llama 3.1 8B, Mistral 7B)
+2. Go to the "Developer" tab
+3. Select your model and enable the server via the toggle
+
+Then run:
+
+```bash
+export CLAUDE_CODE_USE_OPENAI=1
+export OPENAI_BASE_URL=http://localhost:1234/v1
+export OPENAI_MODEL=your-model-name
+# export OPENAI_API_KEY=lmstudio # optional: some users need a dummy key
+
+openclaude
+```
+
+Replace `your-model-name` with the model name shown in LM Studio.
+
+No API key is needed for LM Studio local models (but uncomment the `OPENAI_API_KEY` line if you hit auth errors).
+
 ## 4. If `openclaude` Is Not Found
 
 Close the terminal, open a new one, and try again:
@@ -89,6 +116,14 @@ Check the basics:
 - make sure Ollama is running
 - make sure the model was pulled successfully
 
+### For LM Studio
+
+- make sure LM Studio is installed
+- make sure LM Studio is running
+- make sure the server is enabled (toggle on in the "Developer" tab)
+- make sure a model is loaded in LM Studio
+- make sure the model name matches what you set in `OPENAI_MODEL`
+
 ## 6. Updating OpenClaude
 
 ```bash
diff --git a/docs/quick-start-windows.md b/docs/quick-start-windows.md
index dfac8782..5593fc52 100644
--- a/docs/quick-start-windows.md
+++ b/docs/quick-start-windows.md
@@ -66,6 +66,33 @@ openclaude
 
 No API key is needed for Ollama local models.
 
+### Option D: LM Studio
+
+Install LM Studio first from:
+
+- `https://lmstudio.ai/`
+
+Then in LM Studio:
+
+1. Download a model (e.g., Llama 3.1 8B, Mistral 7B)
+2. Go to the "Developer" tab
+3. Select your model and enable the server via the toggle
+
+Then run:
+
+```powershell
+$env:CLAUDE_CODE_USE_OPENAI="1"
+$env:OPENAI_BASE_URL="http://localhost:1234/v1"
+$env:OPENAI_MODEL="your-model-name"
+# $env:OPENAI_API_KEY="lmstudio" # optional: some users need a dummy key
+
+openclaude
+```
+
+Replace `your-model-name` with the model name shown in LM Studio.
+
+No API key is needed for LM Studio local models (but uncomment the `OPENAI_API_KEY` line if you hit auth errors).
+
 ## 4. If `openclaude` Is Not Found
 
 Close PowerShell, open a new one, and try again:
@@ -89,6 +116,14 @@ Check the basics:
 - make sure Ollama is running
 - make sure the model was pulled successfully
 
+### For LM Studio
+
+- make sure LM Studio is installed
+- make sure LM Studio is running
+- make sure the server is enabled (toggle on in the "Developer" tab)
+- make sure a model is loaded in LM Studio
+- make sure the model name matches what you set in `OPENAI_MODEL`
+
 ## 6. Updating OpenClaude
 
 ```powershell
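
---

The PR summary above describes a `check_lmstudio_running` health check that was dropped when the PR became doc-only. As a rough sketch only (the real signature and behavior in `lmstudio_provider.py` are not shown in this patch), such a check could probe LM Studio's OpenAI-compatible `/models` endpoint on the default port 1234:

```python
import urllib.error
import urllib.request


def check_lmstudio_running(base_url: str = "http://localhost:1234/v1",
                           timeout: float = 2.0) -> bool:
    """Hypothetical sketch: return True if an LM Studio server answers
    on its OpenAI-compatible /models endpoint, False otherwise."""
    try:
        # LM Studio exposes GET /v1/models when its server toggle is on
        with urllib.request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Nothing listening (server toggle off, or LM Studio not started)
        return False


if __name__ == "__main__":
    print("LM Studio running:", check_lmstudio_running())
```

A check like this is what makes the "make sure the server is enabled" troubleshooting step scriptable: it returns `False` instead of raising when no server is listening.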