Add DeepSeek V4 flash/pro support and DeepSeek thinking compatibility (#877)
* Add DeepSeek V4 support and thinking compatibility
* Fix DeepSeek profile persistence regression
* Align multi-model handling with openai-multi-model
@@ -41,11 +41,13 @@ openclaude
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_BASE_URL=https://api.deepseek.com/v1
-export OPENAI_MODEL=deepseek-chat
+export OPENAI_MODEL=deepseek-v4-flash

openclaude
```

Use `deepseek-v4-pro` when you want the stronger model. `deepseek-chat` and `deepseek-reasoner` still work as DeepSeek's legacy API aliases.
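By way of illustration, switching to the stronger model only changes the `OPENAI_MODEL` export; the endpoint, key placeholder, and launch command stay exactly as shown above:

```shell
# Same OpenAI-compatible DeepSeek setup as above,
# but selecting the stronger V4 pro tier.
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_BASE_URL=https://api.deepseek.com/v1
export OPENAI_MODEL=deepseek-v4-pro
```

Then launch `openclaude` as before.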
### Option C: Ollama
Install Ollama first from:
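By analogy with the DeepSeek setup above, an Ollama configuration would presumably look like the sketch below. Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1`; the model name (`llama3` here) and placeholder key are assumptions, so substitute whatever model you have pulled locally.

```shell
# Hypothetical Ollama configuration, mirroring the DeepSeek example.
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=ollama                      # Ollama ignores the key, but clients often require one
export OPENAI_BASE_URL=http://localhost:11434/v1  # Ollama's OpenAI-compatible endpoint
export OPENAI_MODEL=llama3                        # assumed model name; use any locally pulled model
```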