# OpenClaude Quick Start for Windows
This guide uses Windows PowerShell.
## 1. Install Node.js
Install Node.js 20 or newer from:
- `https://nodejs.org/`
Then open PowerShell and verify the installation:
```powershell
node --version
npm --version
```
## 2. Install OpenClaude
```powershell
npm install -g @gitlawb/openclaude
```
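To confirm the package is installed globally, you can list it with npm:

```powershell
npm list -g @gitlawb/openclaude
```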
## 3. Pick One Provider
### Option A: OpenAI
Replace `sk-your-key-here` with your real key.
```powershell
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"
openclaude
```
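Note that `$env:` assignments last only for the current PowerShell session. If you want the settings to persist, one option is `setx`, which writes user-level environment variables (new shells pick them up; the one you run it in does not):

```powershell
setx CLAUDE_CODE_USE_OPENAI "1"
setx OPENAI_API_KEY "sk-your-key-here"
setx OPENAI_MODEL "gpt-4o"
```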
### Option B: DeepSeek
```powershell
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_BASE_URL="https://api.deepseek.com/v1"
$env:OPENAI_MODEL="deepseek-v4-flash"
openclaude
```
Use `deepseek-v4-pro` when you want the stronger model. `deepseek-chat` and `deepseek-reasoner` still work as DeepSeek's legacy API aliases.
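DeepSeek exposes an OpenAI-compatible API, so you can sanity-check your key and base URL before launching. This assumes the standard `/models` route is available:

```powershell
$headers = @{ Authorization = "Bearer $env:OPENAI_API_KEY" }
Invoke-RestMethod -Uri "https://api.deepseek.com/v1/models" -Headers $headers
```

A successful response lists the model IDs your key can access; an auth error here means the key, not OpenClaude, is the problem.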
### Option C: Ollama
Install Ollama first from:
- `https://ollama.com/download/windows`
Then run:
```powershell
ollama pull llama3.1:8b
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:11434/v1"
$env:OPENAI_MODEL="llama3.1:8b"
openclaude
```
No API key is needed for Ollama local models.
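If `openclaude` cannot reach Ollama, you can verify that the server is running and the model was pulled:

```powershell
ollama list                                        # shows locally pulled models
Invoke-RestMethod -Uri "http://localhost:11434/v1/models"  # OpenAI-compatible endpoint
```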
### Option D: LM Studio
Install LM Studio first from:
- `https://lmstudio.ai/`
Then in LM Studio:
1. Download a model (e.g., Llama 3.1 8B, Mistral 7B)
2. Go to the "Developer" tab
3. Select your model and enable the server via the toggle
Then run:
```powershell
$env:CLAUDE_CODE_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:1234/v1"
$env:OPENAI_MODEL="your-model-name"
# $env:OPENAI_API_KEY="lmstudio" # optional: some users need a dummy key
openclaude
```
Replace `your-model-name` with the model name shown in LM Studio.
No API key is needed for LM Studio local models (but uncomment the `OPENAI_API_KEY` line if you hit auth errors).
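To see the exact model names the LM Studio server exposes (handy for filling in `OPENAI_MODEL`), you can query its OpenAI-compatible models endpoint while the server is running:

```powershell
(Invoke-RestMethod -Uri "http://localhost:1234/v1/models").data.id
```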
## 4. If `openclaude` Is Not Found
npm updates your PATH during a global install, but only new shells see the change. Close PowerShell, open a new window, and try again:
```powershell
openclaude
```
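If a fresh shell still cannot find the command, npm's global install directory may be missing from your PATH. On Windows, global executables live directly in the npm prefix folder, so you can locate it and add it for the current session:

```powershell
npm config get prefix               # this folder should appear in your PATH
$env:Path += ";$(npm config get prefix)"
```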
## 5. If Your Provider Fails
Check the basics:
### For OpenAI or DeepSeek
- make sure the API key is valid and active
- make sure you copied the full key, with no extra spaces or line breaks
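A quick way to test the key and endpoint together is to request the model list (this assumes the OpenAI-compatible `/models` route and falls back to OpenAI's API when `OPENAI_BASE_URL` is unset):

```powershell
$base = if ($env:OPENAI_BASE_URL) { $env:OPENAI_BASE_URL } else { "https://api.openai.com/v1" }
Invoke-RestMethod -Uri "$base/models" -Headers @{ Authorization = "Bearer $env:OPENAI_API_KEY" }
```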
### For Ollama
- make sure Ollama is installed
- make sure Ollama is running
- make sure the model was pulled successfully
### For LM Studio
- make sure LM Studio is installed
- make sure LM Studio is running
- make sure the server is enabled (toggle on in the "Developer" tab)
- make sure a model is loaded in LM Studio
- make sure the model name matches what you set in `OPENAI_MODEL`
## 6. Updating OpenClaude
```powershell
npm install -g @gitlawb/openclaude@latest
```
## 7. Uninstalling OpenClaude
```powershell
npm uninstall -g @gitlawb/openclaude
```
## Need Advanced Setup?
See:
- [Advanced Setup](advanced-setup.md)