OpenClaude Quick Start for Windows
This guide uses Windows PowerShell.
1. Install Node.js
Install Node.js 20 or newer from:
https://nodejs.org/
Then open PowerShell and verify the installation:
node --version
npm --version
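If both commands print a version number, Node.js is installed correctly. The exact numbers will differ, but the output looks something like:
v20.11.1
10.2.4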
2. Install OpenClaude
npm install -g @gitlawb/openclaude
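To confirm the package was installed globally, you can list it with npm:
npm ls -g @gitlawb/openclaude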
3. Pick One Provider
Option A: OpenAI
Replace sk-your-key-here with your real key.
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"
openclaude
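The $env: lines above only last for the current PowerShell session. If you want the settings to survive closing the window, one option is the standard Windows setx command, which writes them to your user environment (new PowerShell windows will pick them up; the current one will not):
setx CLAUDE_CODE_USE_OPENAI "1"
setx OPENAI_API_KEY "sk-your-key-here"
setx OPENAI_MODEL "gpt-4o"
The same approach works for the other providers below.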
Option B: DeepSeek
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_BASE_URL="https://api.deepseek.com/v1"
$env:OPENAI_MODEL="deepseek-v4-flash"
openclaude
Use deepseek-v4-pro when you want the stronger model. deepseek-chat and deepseek-reasoner still work as DeepSeek's legacy API aliases.
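Before launching openclaude, you can check that your key and base URL work by listing the models the API reports, assuming the endpoint exposes the usual OpenAI-compatible /models route:
Invoke-RestMethod -Uri "$env:OPENAI_BASE_URL/models" -Headers @{ Authorization = "Bearer $env:OPENAI_API_KEY" }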
Option C: Ollama
Install Ollama first from:
https://ollama.com/download/windows
Then run:
ollama pull llama3.1:8b
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:11434/v1"
$env:OPENAI_MODEL="llama3.1:8b"
openclaude
No API key is needed for Ollama local models.
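If you want to confirm the model downloaded successfully before starting openclaude, list your local models:
ollama list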
Option D: LM Studio
Install LM Studio first from:
https://lmstudio.ai/
Then in LM Studio:
- Download a model (e.g., Llama 3.1 8B, Mistral 7B)
- Go to the "Developer" tab
- Select your model and enable the server via the toggle
Then run:
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:1234/v1"
$env:OPENAI_MODEL="your-model-name"
# $env:OPENAI_API_KEY="lmstudio" # optional: some users need a dummy key
openclaude
Replace your-model-name with the model name shown in LM Studio.
No API key is needed for LM Studio local models (but uncomment the OPENAI_API_KEY line if you hit auth errors).
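You can also check that the LM Studio server is reachable and see the exact model names it exposes, assuming it serves the usual OpenAI-compatible /models route:
Invoke-RestMethod -Uri "http://localhost:1234/v1/models"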
4. If openclaude Is Not Found
Close PowerShell and open a new window so the PATH change from the install is picked up, then try again:
openclaude
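If a new window still cannot find it, a quick diagnostic is to check where npm installs global packages and whether that folder is on your PATH:
npm config get prefix
$env:Path -split ";" | Select-String -SimpleMatch (npm config get prefix)
If the second command prints nothing, add the folder from the first command to your PATH.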
5. If Your Provider Fails
Check the basics:
For OpenAI or DeepSeek
- make sure the key is valid (not expired or revoked)
- make sure you copied the whole key, with no extra spaces
For Ollama
- make sure Ollama is installed
- make sure Ollama is running
- make sure the model was pulled successfully
For LM Studio
- make sure LM Studio is installed
- make sure LM Studio is running
- make sure the server is enabled (toggle on in the "Developer" tab)
- make sure a model is loaded in LM Studio
- make sure the model name matches what you set in OPENAI_MODEL
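Whichever provider you use, it also helps to confirm the environment variables are actually set in your current PowerShell session:
Get-ChildItem Env: | Where-Object Name -like "*OPENAI*"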
6. Updating OpenClaude
npm install -g @gitlawb/openclaude@latest
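To compare the version you have installed with the latest published release:
npm ls -g @gitlawb/openclaude
npm view @gitlawb/openclaude version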
7. Uninstalling OpenClaude
npm uninstall -g @gitlawb/openclaude
Need Advanced Setup?
Use: