
# OpenClaude Quick Start for macOS and Linux

This guide uses a standard shell (bash or zsh) in a terminal emulator such as Terminal or iTerm.

## 1. Install Node.js

Install Node.js 20 or newer from:

- https://nodejs.org/

Then check it:

```shell
node --version
npm --version
```
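The version check can be scripted; a minimal sketch of the parsing, using an example version string (in real use, capture the output of `node --version`):

```shell
# Sketch: confirm the Node.js major version is 20 or newer.
# Example value; in real use: NODE_VERSION="$(node --version)"
NODE_VERSION="v20.11.1"
MAJOR="${NODE_VERSION#v}"   # strip the leading "v" -> 20.11.1
MAJOR="${MAJOR%%.*}"        # keep the major part -> 20
if [ "$MAJOR" -ge 20 ]; then
  echo "Node.js $NODE_VERSION is new enough"
else
  echo "Node.js $NODE_VERSION is too old; install 20 or newer" >&2
fi
```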

## 2. Install OpenClaude

```shell
npm install -g @gitlawb/openclaude
```

## 3. Pick One Provider

### Option A: OpenAI

Replace `sk-your-key-here` with your real key.

```shell
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_MODEL=gpt-4o

openclaude
```

### Option B: DeepSeek

```shell
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_BASE_URL=https://api.deepseek.com/v1
export OPENAI_MODEL=deepseek-v4-flash

openclaude
```

Use `deepseek-v4-pro` when you want the stronger model. `deepseek-chat` and `deepseek-reasoner` still work as DeepSeek's legacy API aliases.
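The model choices above can be wrapped in a small helper; a sketch using only the model names from this guide (the `pick_deepseek_model` function is hypothetical, not part of OpenClaude):

```shell
# Hypothetical helper: map a tier to one of the DeepSeek model names in this guide.
pick_deepseek_model() {
  case "$1" in
    fast)      echo "deepseek-v4-flash" ;;
    strong)    echo "deepseek-v4-pro" ;;
    legacy)    echo "deepseek-chat" ;;       # legacy alias
    reasoning) echo "deepseek-reasoner" ;;   # legacy alias
    *)         echo "deepseek-v4-flash" ;;   # sensible default
  esac
}

OPENAI_MODEL="$(pick_deepseek_model strong)"
echo "$OPENAI_MODEL"   # deepseek-v4-pro
```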

### Option C: Ollama

Install Ollama first from:

- https://ollama.com/download

Then run:

```shell
ollama pull llama3.1:8b
```

```shell
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=llama3.1:8b

openclaude
```

No API key is needed for Ollama local models.
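Before launching, you can probe the Ollama server; a sketch assuming the default port from this guide (requires `curl`):

```shell
# Sketch: report whether the local Ollama server answers on its default port.
if curl -fsS http://localhost:11434/v1/models >/dev/null 2>&1; then
  STATUS="up"
else
  STATUS="down"   # start it with `ollama serve` or the desktop app
fi
echo "ollama server: $STATUS"
```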

### Option D: LM Studio

Install LM Studio first from:

- https://lmstudio.ai/

Then in LM Studio:

1. Download a model (e.g., Llama 3.1 8B, Mistral 7B)
2. Go to the "Developer" tab
3. Select your model and enable the server via the toggle

Then run:

```shell
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_MODEL=your-model-name
# export OPENAI_API_KEY=lmstudio  # optional: some users need a dummy key

openclaude
```

Replace `your-model-name` with the model name shown in LM Studio.

No API key is needed for LM Studio local models, but uncomment the `OPENAI_API_KEY` line if you hit auth errors.

## 4. If `openclaude` Is Not Found

Close the terminal, open a new one, and try again:

```shell
openclaude
```
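If a fresh terminal still can't find it, npm's global bin directory may be missing from `PATH`. A sketch of the fix (the prefix below is an example value; in real use take it from `npm prefix -g`):

```shell
# Sketch: put npm's global bin directory on PATH for this session.
# Example value; in real use: NPM_PREFIX="$(npm prefix -g)"
NPM_PREFIX="/usr/local"
export PATH="$NPM_PREFIX/bin:$PATH"
case ":$PATH:" in
  *":$NPM_PREFIX/bin:"*) echo "npm bin dir is on PATH" ;;
  *)                     echo "PATH update failed" >&2 ;;
esac
```

To make the change permanent, add the `export PATH=...` line to your shell profile (`~/.bashrc` or `~/.zshrc`).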

## 5. If Your Provider Fails

Check the basics:

### For OpenAI or DeepSeek

- make sure the key is valid
- make sure you copied it in full
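Those two checks can be partly scripted; a sketch that only inspects the environment (no network call), deliberately using the placeholder value from this guide to show the failure path:

```shell
# Sketch: flag a missing or still-placeholder OpenAI/DeepSeek key.
OPENAI_API_KEY="sk-your-key-here"   # placeholder on purpose
if [ -z "${OPENAI_API_KEY:-}" ] || [ "$OPENAI_API_KEY" = "sk-your-key-here" ]; then
  echo "key check: missing or placeholder"
else
  echo "key check: set"
fi
```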

### For Ollama

- make sure Ollama is installed
- make sure Ollama is running
- make sure the model was pulled successfully

### For LM Studio

- make sure LM Studio is installed
- make sure LM Studio is running
- make sure the server is enabled (toggle on in the "Developer" tab)
- make sure a model is loaded in LM Studio
- make sure the model name matches what you set in `OPENAI_MODEL`
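The last check can be automated against the server's `/v1/models` response; a sketch using an example payload (in real use, fetch it with `curl -s http://localhost:1234/v1/models` — the model ids below are illustrative):

```shell
# Sketch: check that OPENAI_MODEL appears in the server's model list.
# Example payload; in real use: MODELS_JSON="$(curl -s http://localhost:1234/v1/models)"
MODELS_JSON='{"data":[{"id":"llama-3.1-8b"},{"id":"mistral-7b"}]}'
OPENAI_MODEL="llama-3.1-8b"
if printf '%s' "$MODELS_JSON" | grep -q "\"id\":\"$OPENAI_MODEL\""; then
  echo "model check: match"
else
  echo "model check: not found; copy the exact id the server reports" >&2
fi
```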

## 6. Updating OpenClaude

```shell
npm install -g @gitlawb/openclaude@latest
```

## 7. Uninstalling OpenClaude

```shell
npm uninstall -g @gitlawb/openclaude
```

## Need Advanced Setup?

Use: