diff --git a/README.md b/README.md
index 1e2bbe6e..52ecc7b8 100644
--- a/README.md
+++ b/README.md
@@ -92,6 +92,16 @@ $env:OPENAI_MODEL="qwen2.5-coder:7b"
 openclaude
 ```
 
+### Using Ollama's launch command
+
+If you have [Ollama](https://ollama.com) installed, you can skip the env var setup entirely:
+
+```bash
+ollama launch openclaude --model qwen2.5-coder:7b
+```
+
+This automatically sets `ANTHROPIC_BASE_URL`, model routing, and auth so all API traffic goes through your local Ollama instance. It works with any model you have pulled, local or cloud.
+
 ## Setup Guides
 
 Beginner-friendly guides:
@@ -114,7 +124,7 @@
 | GitHub Models | `/onboard-github` | Interactive onboarding with saved credentials |
 | Codex OAuth | `/provider` | Opens ChatGPT sign-in in your browser and stores Codex credentials securely |
 | Codex | `/provider` | Uses existing Codex CLI auth, OpenClaude secure storage, or env credentials |
-| Ollama | `/provider` or env vars | Local inference with no API key |
+| Ollama | `/provider`, env vars, or `ollama launch` | Local inference with no API key |
 | Atomic Chat | advanced setup | Local Apple Silicon backend |
 | Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |
 
diff --git a/docs/advanced-setup.md b/docs/advanced-setup.md
index 75c401ee..291aee7d 100644
--- a/docs/advanced-setup.md
+++ b/docs/advanced-setup.md
@@ -84,6 +84,16 @@ OpenRouter model availability changes over time. If a model stops working, try a
 
 ### Ollama
 
+Using `ollama launch` (recommended if you have Ollama installed):
+
+```bash
+ollama launch openclaude --model llama3.3:70b
+```
+
+This handles all environment setup automatically; no env vars are needed. It works with any local or cloud model available in your Ollama instance.
+
+Using environment variables manually:
+
 ```bash
 ollama pull llama3.3:70b
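For reviewers, a minimal sketch of what the launcher's "automatic env setup" roughly corresponds to by hand, using the variable names already present in the docs (`ANTHROPIC_BASE_URL`, `OPENAI_MODEL`). The base URL value assumes Ollama's default port 11434; the exact URL path and variables OpenClaude reads may differ, so treat this as an illustration rather than the launcher's actual behavior:

```shell
# Rough manual equivalent of `ollama launch openclaude` (a sketch, not
# the exact steps the launcher performs).
# Ollama serves on port 11434 by default; the base URL is an assumption.
export ANTHROPIC_BASE_URL="http://localhost:11434"
export OPENAI_MODEL="qwen2.5-coder:7b"
openclaude
```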