Document the new `ollama launch openclaude` command as a shortcut
for running OpenClaude through a local Ollama instance. This is
now supported in Ollama's launch system and handles all environment
variable setup automatically — no manual env vars needed.
Changes:
- README.md: Add "Using Ollama's launch command" section after the
manual Ollama env var setup, and update the provider table to
list `ollama launch` as a setup path for Ollama
- docs/advanced-setup.md: Add `ollama launch` as the recommended
method at the top of the Ollama section, with the manual env var
approach kept below as an alternative
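For context, the manual env-var setup that `ollama launch openclaude` automates might look like the sketch below. This is an assumption modeled on the LM Studio usage example later in these notes (same `CLAUDE_CODE_USE_OPENAI` / `OPENAI_BASE_URL` / `OPENAI_MODEL` variables); the one solid detail is that Ollama's OpenAI-compatible endpoint listens on port 11434 by default.

```shell
# Manual setup that `ollama launch openclaude` replaces (assumed sketch;
# Ollama exposes an OpenAI-compatible API on port 11434 by default).
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=llama3.1:8b   # any model already pulled into Ollama
# With the variables in place, start the CLI:
# openclaude
```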
* docs: add LiteLLM proxy setup guide
Document the setup process for LiteLLM and its integration with OpenClaude, including prerequisites, configuration, and troubleshooting steps.
* Revise LiteLLM setup steps for API key and model
Updated setup instructions for LiteLLM provider configuration.
* docs: fix /provider walkthrough to match actual OpenAI-compatible flow
* docs: fix sub-bullet formatting in /provider steps
* docs: clarify key scope in troubleshooting (LiteLLM proxy process env)
Clarified instruction for upstream provider error regarding API key.
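As a concrete anchor for the commits above, a minimal LiteLLM proxy setup might look like this. The model name is an illustrative assumption (port 4000 is LiteLLM's documented default); the point from the troubleshooting note is that the upstream provider's API key belongs in the proxy's own process environment, not in the shell running openclaude.

```shell
# Assumed sketch: run a LiteLLM proxy and point openclaude at it.
# The upstream key goes in the *proxy's* environment, e.g.:
#   OPENAI_API_KEY=sk-... litellm --model gpt-4o --port 4000
# Then, in the shell that runs openclaude:
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:4000   # LiteLLM proxy default port
export OPENAI_MODEL=gpt-4o                     # must match a model the proxy serves
# openclaude
```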
* ## PR: Add LM Studio Provider Support
### Summary
Adds comprehensive LM Studio integration to openclaude, following the same pattern as the existing Ollama provider. LM Studio is a popular local LLM inference tool that exposes an OpenAI-compatible API.
### Changes (4 files, 672 insertions)
**New Files:**
- `lmstudio_provider.py` (377 lines) - Full provider implementation with:
- Health check functions (`check_lmstudio_running`)
- Model listing (`list_lmstudio_models`)
- Chat completion (`lmstudio_chat`)
- Streaming support (`lmstudio_chat_stream`)
- Comprehensive docstring with setup instructions, troubleshooting, and model recommendations
- `test_lmstudio_provider.py` (227 lines) - Complete test suite with 12 passing tests covering:
- API URL construction
- Server health checks
- Model listing
- Chat completion functionality
**Modified Files:**
- `docs/quick-start-mac-linux.md` (+34 lines) - Added Option D: LM Studio with setup instructions and troubleshooting
- `docs/quick-start-windows.md` (+34 lines) - Added Option D: LM Studio with PowerShell syntax and troubleshooting
### Key Features
- No API key required (local inference)
- Default port: 1234 (LM Studio's standard)
- OpenAI-compatible API integration
- Consistent with existing provider patterns (Ollama, Atomic Chat)
- All tests passing (12/12)
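The "API URL construction" and health-check behavior covered by the test suite can be exercised by hand; a sketch follows, where the `LMSTUDIO_HOST`/`LMSTUDIO_PORT` variable names are assumptions for illustration, not the provider's actual identifiers — only the default port 1234 and the OpenAI-compatible `/v1` path come from this PR.

```shell
# Build the base URL the provider is assumed to use (default port 1234).
LMSTUDIO_HOST="${LMSTUDIO_HOST:-localhost}"
LMSTUDIO_PORT="${LMSTUDIO_PORT:-1234}"
LMSTUDIO_BASE_URL="http://${LMSTUDIO_HOST}:${LMSTUDIO_PORT}/v1"
# A live health check would hit the OpenAI-compatible models endpoint:
#   curl -s "$LMSTUDIO_BASE_URL/models"
echo "$LMSTUDIO_BASE_URL"
```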
### Usage
```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_MODEL=your-model-name
openclaude
```
* Rework the LM Studio PR as a docs-only change
* docs: update LM Studio instructions for recent UI changes
Split the setup documentation into a simple beginner path and a separate advanced path. Add OS-specific quick starts for Windows and macOS/Linux so non-technical users can copy and paste the right commands without sorting through Bun and source-build instructions.