## PR: Add LM Studio Provider Support

### Summary

Adds comprehensive LM Studio integration to openclaude, following the same pattern as the existing Ollama provider. LM Studio is a popular local LLM inference tool that exposes an OpenAI-compatible API.

### Changes (4 files, 672 insertions)

**New Files:**

- `lmstudio_provider.py` (377 lines) - Full provider implementation with:
  - Health check function (`check_lmstudio_running`)
  - Model listing (`list_lmstudio_models`)
  - Chat completion (`lmstudio_chat`)
  - Streaming support (`lmstudio_chat_stream`)
  - Comprehensive docstring with setup instructions, troubleshooting, and model recommendations
- `test_lmstudio_provider.py` (227 lines) - Complete test suite with 12 passing tests covering:
  - API URL construction
  - Server health checks
  - Model listing
  - Chat completion functionality

**Modified Files:**

- `docs/quick-start-mac-linux.md` (+34 lines) - Added Option D: LM Studio with setup instructions and troubleshooting
- `docs/quick-start-windows.md` (+34 lines) - Added Option D: LM Studio with PowerShell syntax and troubleshooting

### Key Features

- No API key required (local inference)
- Default port: 1234 (LM Studio's standard)
- OpenAI-compatible API integration
- Consistent with existing provider patterns (Ollama, Atomic Chat)
- All tests passing (12/12)

### Usage

```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_MODEL=your-model-name
openclaude
```

Follow-up commits:

- Made this a docs-only PR for LM Studio
- Fixed docs for recent LM Studio UI changes
OpenClaude Quick Start for Windows
This guide uses Windows PowerShell.
1. Install Node.js
Install Node.js 20 or newer from:
https://nodejs.org/
Then open PowerShell and check it:
node --version
npm --version
2. Install OpenClaude
npm install -g @gitlawb/openclaude
3. Pick One Provider
Option A: OpenAI
Replace sk-your-key-here with your real key.
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"
openclaude
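Before launching, you can optionally confirm the key works by listing models. This is a hedged sketch: it needs network access and a real key, and the same check works for DeepSeek if you swap in its base URL.

```powershell
# Optional sanity check: list available models with your key.
# A 401 error here means the key is wrong or was copied incompletely.
Invoke-RestMethod -Uri "https://api.openai.com/v1/models" `
  -Headers @{ Authorization = "Bearer $env:OPENAI_API_KEY" }
```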
Option B: DeepSeek
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_BASE_URL="https://api.deepseek.com/v1"
$env:OPENAI_MODEL="deepseek-chat"
openclaude
Option C: Ollama
Install Ollama first from:
https://ollama.com/download/windows
Then run:
ollama pull llama3.1:8b
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:11434/v1"
$env:OPENAI_MODEL="llama3.1:8b"
openclaude
No API key is needed for Ollama local models.
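If openclaude cannot reach Ollama, you can probe its OpenAI-compatible endpoint directly. This sketch assumes Ollama is serving on its default port:

```powershell
# Should return a JSON model list that includes llama3.1:8b after the pull finishes.
# A connection error here means Ollama is not running.
Invoke-RestMethod -Uri "http://localhost:11434/v1/models"
```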
Option D: LM Studio
Install LM Studio first from:
https://lmstudio.ai/
Then in LM Studio:
- Download a model (e.g., Llama 3.1 8B, Mistral 7B)
- Go to the "Developer" tab
- Select your model and enable the server via the toggle
Then run:
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:1234/v1"
$env:OPENAI_MODEL="your-model-name"
# $env:OPENAI_API_KEY="lmstudio" # optional: some users need a dummy key
openclaude
Replace your-model-name with the model name shown in LM Studio.
No API key is needed for LM Studio local models (but uncomment the OPENAI_API_KEY line if you hit auth errors).
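Not sure which name to use? The server reports the exact identifiers of loaded models. This sketch assumes the LM Studio server is running on its default port 1234:

```powershell
# List the models LM Studio is serving; use an "id" value as OPENAI_MODEL.
(Invoke-RestMethod -Uri "http://localhost:1234/v1/models").data | Select-Object id
```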
4. If openclaude Is Not Found
Close PowerShell and open a new window so the updated PATH from npm is picked up, then try again:
openclaude
5. If Your Provider Fails
Check the basics:
For OpenAI or DeepSeek
- make sure the key is real
- make sure you copied it fully
For Ollama
- make sure Ollama is installed
- make sure Ollama is running
- make sure the model was pulled successfully
For LM Studio
- make sure LM Studio is installed
- make sure LM Studio is running
- make sure the server is enabled (toggle on in the "Developer" tab)
- make sure a model is loaded in LM Studio
- make sure the model name matches what you set in OPENAI_MODEL
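The last two checks can be tested together by comparing the server's model ids against what you exported. A hedged sketch: if the request fails to connect, the server toggle is off; if your model id is missing, either nothing is loaded or OPENAI_MODEL is misspelled.

```powershell
# Compare LM Studio's reported model ids with your OPENAI_MODEL setting.
$ids = (Invoke-RestMethod -Uri "http://localhost:1234/v1/models").data.id
if ($ids -contains $env:OPENAI_MODEL) {
  "model name matches"
} else {
  "check OPENAI_MODEL; server offers: $ids"
}
```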
6. Updating OpenClaude
npm install -g @gitlawb/openclaude@latest
7. Uninstalling OpenClaude
npm uninstall -g @gitlawb/openclaude
Need Advanced Setup?
Use: