feat: (Extension of #175) added cross-platform system-wide environment variable setup guide for all providers (#185)
* Added instructions to .env.example so OpenClaude can be used system-wide.
* Applied the .env.example changes suggested earlier in the PR thread.
.env.example | 149
@@ -8,6 +8,120 @@
 # All other sections can be left commented out.
 # =============================================================================
 
+# =============================================================================
+# SYSTEM-WIDE SETUP (OPTIONAL)
+# =============================================================================
+# Instead of using a .env file per project, you can set these variables
+# system-wide so OpenClaude works from any directory on your machine.
+#
+# STEP 1: Pick your provider variables from the list below.
+# STEP 2: Set them using the method for your OS (see further down).
+#
+# ── Provider variables ───────────────────────────────────────────────
+#
+# Option 1 — Anthropic:
+# ANTHROPIC_API_KEY=sk-ant-your-key-here
+# ANTHROPIC_MODEL=claude-sonnet-4-5 (optional)
+# ANTHROPIC_BASE_URL=https://api.anthropic.com (optional)
+#
+# Option 2 — OpenAI:
+# CLAUDE_CODE_USE_OPENAI=1
+# OPENAI_API_KEY=sk-your-key-here
+# OPENAI_MODEL=gpt-4o
+# OPENAI_BASE_URL=https://api.openai.com/v1 (optional)
+#
+# Option 3 — Google Gemini:
+# CLAUDE_CODE_USE_GEMINI=1
+# GEMINI_API_KEY=your-gemini-key-here
+# GEMINI_MODEL=gemini-2.0-flash
+# GEMINI_BASE_URL=https://generativelanguage.googleapis.com (optional)
+#
+# Option 4 — GitHub Models:
+# CLAUDE_CODE_USE_GITHUB=1
+# GITHUB_TOKEN=ghp_your-token-here
+#
+# Option 5 — Ollama (local):
+# CLAUDE_CODE_USE_OPENAI=1
+# OPENAI_BASE_URL=http://localhost:11434/v1
+# OPENAI_API_KEY=ollama
+# OPENAI_MODEL=llama3.2
+#
+# Option 6 — LM Studio (local):
+# CLAUDE_CODE_USE_OPENAI=1
+# OPENAI_BASE_URL=http://localhost:1234/v1
+# OPENAI_MODEL=your-model-id-here
+# OPENAI_API_KEY=lmstudio (optional)
+#
+# Option 7 — AWS Bedrock (may also need: aws configure):
+# CLAUDE_CODE_USE_BEDROCK=1
+# AWS_REGION=us-east-1
+# AWS_DEFAULT_REGION=us-east-1
+# AWS_BEARER_TOKEN_BEDROCK=your-bearer-token-here
+# ANTHROPIC_BEDROCK_BASE_URL=https://bedrock-runtime.us-east-1.amazonaws.com
+#
+# Option 8 — Google Vertex AI:
+# CLAUDE_CODE_USE_VERTEX=1
+# ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
+# CLOUD_ML_REGION=us-east5
+# GOOGLE_CLOUD_PROJECT=your-gcp-project-id
+#
+# ── How to set variables on each OS ──────────────────────────────────
+#
+# macOS (zsh):
+# 1. Open: nano ~/.zshrc
+# 2. Add each variable as: export VAR_NAME=value
+# 3. Save and reload: source ~/.zshrc
+#
+# Linux (bash):
+# 1. Open: nano ~/.bashrc
+# 2. Add each variable as: export VAR_NAME=value
+# 3. Save and reload: source ~/.bashrc
+#
+# Windows (PowerShell):
+# Run for each variable:
+# [System.Environment]::SetEnvironmentVariable('VAR_NAME', 'value', 'User')
+# Then restart your terminal.
+#
+# Windows (Command Prompt):
+# Run for each variable:
+# setx VAR_NAME value
+# Then restart your terminal.
+#
+# Windows (GUI):
+# Settings > System > About > Advanced System Settings >
+# Environment Variables > under "User variables" click New,
+# then add each variable.
+#
+# ── Important notes ──────────────────────────────────────────────────
+#
+# LOCAL SERVERS: If using LM Studio or Ollama, the server MUST be
+# running with a model loaded before you launch OpenClaude —
+# otherwise you'll get connection errors.
+#
+# SWITCHING PROVIDERS: To temporarily switch, unset the relevant
+# variables in your current terminal session:
+#
+# macOS / Linux:
+# unset VAR_NAME
+# # e.g.: unset CLAUDE_CODE_USE_OPENAI OPENAI_BASE_URL OPENAI_MODEL
+#
+# Windows (PowerShell — current session only):
+# Remove-Item Env:VAR_NAME
+#
+# To permanently remove a variable on Windows:
+# [System.Environment]::SetEnvironmentVariable('VAR_NAME', $null, 'User')
+#
+# LOAD ORDER:
+# Shell and system environment variables are inherited by the process.
+# Project .env files are only used if your launcher or shell loads them
+# before starting OpenClaude.
+#
+# COMPATIBILITY:
+# System-wide variables work regardless of how you run OpenClaude:
+# npx, global npm install, bun run, or node directly. Any process
+# launched from your terminal inherits your shell's environment.
+#
+# REMINDER: Make sure .env is in your .gitignore to avoid committing secrets.
+# =============================================================================
 
 # =============================================================================
 # PROVIDER SELECTION — uncomment ONE block below
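The macOS/Linux steps above can be sketched as a single snippet. This is a minimal sketch, assuming zsh and the Anthropic variables; all values are placeholders, and the export lines are staged in a temporary file so they can be reviewed before touching the real profile:

```shell
# Stage the export lines in a temporary file so they can be reviewed
# before being appended to ~/.zshrc (use ~/.bashrc on Linux/bash).
STAGED="$(mktemp)"
cat > "$STAGED" <<'EOF'
export ANTHROPIC_API_KEY=sk-ant-your-key-here
export ANTHROPIC_MODEL=claude-sonnet-4-5
EOF
# Source it in the current shell to confirm the variables take effect.
. "$STAGED"
echo "ANTHROPIC_MODEL=$ANTHROPIC_MODEL"
# When satisfied: cat "$STAGED" >> ~/.zshrc && source ~/.zshrc
```

New terminals will then pick the variables up automatically, which is what makes the setup system-wide.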
@@ -44,7 +158,7 @@ ANTHROPIC_API_KEY=sk-ant-your-key-here
 # GEMINI_MODEL=gemini-2.0-flash
 
 # Use a custom Gemini endpoint (optional)
-# GEMINI_BASE_URL=https://generativelanguage.googleapis.com
+# GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta/openai
 
 
 # -----------------------------------------------------------------------------
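The `/v1beta/openai` suffix in the corrected base URL matters because OpenAI-style clients typically build request URLs by appending an endpoint path to the configured base. A small sketch of the resulting URL (the appended path is the conventional OpenAI chat endpoint, shown here for illustration):

```shell
# With the corrected base URL, an OpenAI-compatible client typically
# appends the endpoint path (e.g. /chat/completions) to it.
GEMINI_BASE_URL="https://generativelanguage.googleapis.com/v1beta/openai"
echo "${GEMINI_BASE_URL}/chat/completions"
```

Without the suffix, the client would hit the bare Gemini host and the request path would not resolve to the OpenAI-compatible route.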
@@ -62,10 +176,39 @@ ANTHROPIC_API_KEY=sk-ant-your-key-here
 # OPENAI_API_KEY=ollama
 # OPENAI_MODEL=llama3.2
 
+# -----------------------------------------------------------------------------
+# Option 6: LM Studio (local models)
+# -----------------------------------------------------------------------------
+# LM Studio exposes an OpenAI-compatible API, so we use the OpenAI provider.
+# Make sure LM Studio is running with the Developer server enabled
+# (Developer tab > toggle server ON).
+#
+# Steps:
+# 1. Download and install LM Studio from https://lmstudio.ai
+# 2. Search for and download a model (e.g. any coding or instruct model)
+# 3. Load the model and start the Developer server
+# 4. Set OPENAI_MODEL to the model ID shown in LM Studio's Developer tab
+#
+# The default server URL is http://localhost:1234 — change the port below
+# if you've configured a different one in LM Studio.
+#
+# OPENAI_API_KEY is optional — LM Studio runs locally and ignores it.
+# Some clients require a non-empty value; if you get auth errors, set it
+# to any dummy value (e.g. "lmstudio").
+#
+# CLAUDE_CODE_USE_OPENAI=1
+# OPENAI_BASE_URL=http://localhost:1234/v1
+# OPENAI_MODEL=your-model-id-here
+
 
 # -----------------------------------------------------------------------------
-# Option 6: AWS Bedrock
+# Option 7: AWS Bedrock
 # -----------------------------------------------------------------------------
 
+# You may also need AWS CLI credentials configured (run: aws configure)
+# or have AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY set in your
+# environment in addition to the variables below.
+#
 # CLAUDE_CODE_USE_BEDROCK=1
 # AWS_REGION=us-east-1
 # AWS_DEFAULT_REGION=us-east-1
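Since both Ollama and LM Studio require their server to be running before launch, a quick reachability check can save a confusing connection error. A sketch assuming curl is installed (11434 is Ollama's default port, 1234 LM Studio's; the `check_server` helper is hypothetical, not part of OpenClaude):

```shell
# Return 0 if an OpenAI-compatible server answers at the given base URL.
check_server() {
  base="$1"
  if curl -s -o /dev/null --max-time 2 "$base/models"; then
    echo "server at $base is reachable"
  else
    echo "server at $base is NOT reachable; start it first" >&2
    return 1
  fi
}

# Ollama default; use http://localhost:1234/v1 for LM Studio.
check_server "http://localhost:11434/v1" || true
```

Running this before launching OpenClaude confirms the local provider is up with its API served.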
@@ -74,7 +217,7 @@ ANTHROPIC_API_KEY=sk-ant-your-key-here
 
 
 # -----------------------------------------------------------------------------
-# Option 7: Google Vertex AI
+# Option 8: Google Vertex AI
 # -----------------------------------------------------------------------------
 # CLAUDE_CODE_USE_VERTEX=1
 # ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
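The SWITCHING PROVIDERS note in the new guide can be exercised like this on macOS/Linux; the exported values are placeholders standing in for an active Ollama override, and clearing them only affects the current session:

```shell
# Suppose the OpenAI/Ollama overrides are active in this session:
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=llama3.2

# Clear them so OpenClaude falls back to whatever provider variables
# remain set system-wide (e.g. ANTHROPIC_API_KEY):
unset CLAUDE_CODE_USE_OPENAI OPENAI_BASE_URL OPENAI_MODEL
echo "overrides cleared"
```

Because `unset` is session-local, system-wide values set in a shell profile return untouched in the next terminal.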