* ## PR: Add LM Studio Provider Support
### Summary
Adds comprehensive LM Studio integration to openclaude, following the same pattern as the existing Ollama provider. LM Studio is a popular local LLM inference tool that exposes an OpenAI-compatible API.
### Changes (4 files, 672 insertions)
**New Files:**
- `lmstudio_provider.py` (377 lines) - Full provider implementation with:
  - Health check functions (`check_lmstudio_running`)
  - Model listing (`list_lmstudio_models`)
  - Chat completion (`lmstudio_chat`)
  - Streaming support (`lmstudio_chat_stream`)
  - Comprehensive docstring with setup instructions, troubleshooting, and model recommendations
- `test_lmstudio_provider.py` (227 lines) - Complete test suite with 12 passing tests covering:
  - API URL construction
  - Server health checks
  - Model listing
  - Chat completion functionality
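To illustrate what the health-check and model-listing functions might look like, here is a minimal stdlib-only sketch. The function names come from the list above, but the bodies, the `base_url` parameter, and the default URL constant are assumptions, not the PR's actual code:

```python
import json
import urllib.error
import urllib.request

# Assumed default; LM Studio serves its OpenAI-compatible API on port 1234.
LMSTUDIO_BASE_URL = "http://localhost:1234/v1"

def check_lmstudio_running(base_url: str = LMSTUDIO_BASE_URL) -> bool:
    """Return True if the LM Studio server answers on its /models endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Server not running, port closed, or request timed out.
        return False

def list_lmstudio_models(base_url: str = LMSTUDIO_BASE_URL) -> list[str]:
    """Return the IDs of models LM Studio currently exposes."""
    with urllib.request.urlopen(f"{base_url}/models", timeout=5) as resp:
        payload = json.load(resp)
    # OpenAI-compatible /v1/models shape: {"data": [{"id": "..."}, ...]}
    return [model["id"] for model in payload.get("data", [])]
```

Because the API is OpenAI-compatible, no authentication header is needed for local inference; a failed connection simply reports the server as not running.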
**Modified Files:**
- `docs/quick-start-mac-linux.md` (+34 lines) - Added Option D: LM Studio with setup instructions and troubleshooting
- `docs/quick-start-windows.md` (+34 lines) - Added Option D: LM Studio with PowerShell syntax and troubleshooting
### Key Features
- No API key required (local inference)
- Default port: 1234 (LM Studio's standard)
- OpenAI-compatible API integration
- Consistent with existing provider patterns (Ollama, Atomic Chat)
- All tests passing (12/12)
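As a sketch of how the OpenAI-compatible chat path could work, the request construction can be split from the network call so URL building is testable offline. The helper name `build_chat_request` and all parameter names here are hypothetical, not taken from the PR:

```python
import json
import urllib.request

def build_chat_request(messages, model, base_url="http://localhost:1234/v1"):
    """Build an OpenAI-style chat completion request (no API key needed locally)."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def lmstudio_chat(messages, model, base_url="http://localhost:1234/v1"):
    """Send the request to LM Studio and return the assistant's reply text."""
    req = build_chat_request(messages, model, base_url)
    with urllib.request.urlopen(req, timeout=120) as resp:
        data = json.load(resp)
    # OpenAI-compatible response shape: choices[0].message.content
    return data["choices"][0]["message"]["content"]
```

Separating request construction from I/O mirrors the kind of "API URL construction" coverage the test suite lists above.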
### Usage
```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_MODEL=your-model-name
openclaude
```
* Made the PR documentation-only for LM Studio
* Updated the docs for LM Studio's recent UI changes