feat: add .env.example with all provider configurations
New contributors had to hunt through README and source files to find required environment variables. This adds a single reference file at repo root covering all supported providers with placeholder values, inline comments, and sensible defaults.

Providers covered:
- Anthropic (default)
- OpenAI
- Google Gemini
- GitHub Models
- Ollama (local)
- AWS Bedrock
- Google Vertex AI

Also includes optional tuning vars: CLAUDE_CODE_MAX_RETRIES, CLAUDE_CODE_UNATTENDED_RETRY, OPENCLAUDE_ENABLE_EXTENDED_KEYS, OPENCLAUDE_DISABLE_CO_AUTHORED_BY, API_TIMEOUT_MS, CLAUDE_DEBUG.

Updated .gitignore to add a !.env.example exception so the template is not suppressed by the existing .env.* rule.

Closes #175

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
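The .gitignore change above can be verified locally. This is a minimal sketch in a throwaway repo, not part of the commit itself; it only assumes the two rules the commit message describes (`.env*` plus the `!.env.example` exception):

```shell
# Reproduce the ignore rules in a scratch repo and confirm that .env is
# ignored while the .env.example template is re-included by the negation.
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
printf '.env*\n!.env.example\n' > .gitignore
touch .env .env.example
git check-ignore .env                                    # exit 0: ignored
git check-ignore .env.example || echo "template is tracked"
```

`git check-ignore` exits non-zero for a path no ignore rule matches, so the final line prints only when the `!.env.example` exception is working.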
.env.example (new file, 107 lines)
@@ -0,0 +1,107 @@
# =============================================================================
# OpenClaude Environment Configuration
# =============================================================================
# Copy this file to .env and fill in your values:
#   cp .env.example .env
#
# Only set the variables for the provider you want to use.
# All other sections can be left commented out.
# =============================================================================


# =============================================================================
# PROVIDER SELECTION — uncomment ONE block below
# =============================================================================

# -----------------------------------------------------------------------------
# Option 1: Anthropic (default — no provider flag needed)
# -----------------------------------------------------------------------------
ANTHROPIC_API_KEY=sk-ant-your-key-here

# Override the default model (optional)
# ANTHROPIC_MODEL=claude-sonnet-4-5

# Use a custom Anthropic-compatible endpoint (optional)
# ANTHROPIC_BASE_URL=https://api.anthropic.com


# -----------------------------------------------------------------------------
# Option 2: OpenAI
# -----------------------------------------------------------------------------
# CLAUDE_CODE_USE_OPENAI=1
# OPENAI_API_KEY=sk-your-key-here
# OPENAI_MODEL=gpt-4o

# Use a custom OpenAI-compatible endpoint (optional — defaults to api.openai.com)
# OPENAI_BASE_URL=https://api.openai.com/v1


# -----------------------------------------------------------------------------
# Option 3: Google Gemini
# -----------------------------------------------------------------------------
# CLAUDE_CODE_USE_GEMINI=1
# GEMINI_API_KEY=your-gemini-key-here
# GEMINI_MODEL=gemini-2.0-flash

# Use a custom Gemini endpoint (optional)
# GEMINI_BASE_URL=https://generativelanguage.googleapis.com


# -----------------------------------------------------------------------------
# Option 4: GitHub Models
# -----------------------------------------------------------------------------
# CLAUDE_CODE_USE_GITHUB=1
# GITHUB_TOKEN=ghp_your-token-here


# -----------------------------------------------------------------------------
# Option 5: Ollama (local models)
# -----------------------------------------------------------------------------
# CLAUDE_CODE_USE_OPENAI=1
# OPENAI_BASE_URL=http://localhost:11434/v1
# OPENAI_API_KEY=ollama
# OPENAI_MODEL=llama3.2


# -----------------------------------------------------------------------------
# Option 6: AWS Bedrock
# -----------------------------------------------------------------------------
# CLAUDE_CODE_USE_BEDROCK=1
# AWS_REGION=us-east-1
# AWS_DEFAULT_REGION=us-east-1
# AWS_BEARER_TOKEN_BEDROCK=your-bearer-token-here
# ANTHROPIC_BEDROCK_BASE_URL=https://bedrock-runtime.us-east-1.amazonaws.com


# -----------------------------------------------------------------------------
# Option 7: Google Vertex AI
# -----------------------------------------------------------------------------
# CLAUDE_CODE_USE_VERTEX=1
# ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
# CLOUD_ML_REGION=us-east5
# GOOGLE_CLOUD_PROJECT=your-gcp-project-id


# =============================================================================
# OPTIONAL TUNING
# =============================================================================

# Max number of API retries on failure (default: 10)
# CLAUDE_CODE_MAX_RETRIES=10

# Enable persistent retry mode for unattended/CI sessions
# Retries 429/529 indefinitely with smart backoff
# CLAUDE_CODE_UNATTENDED_RETRY=1

# Enable extended key reporting (Kitty keyboard protocol)
# Useful for iTerm2, WezTerm, Ghostty if modifier keys feel off
# OPENCLAUDE_ENABLE_EXTENDED_KEYS=1

# Disable "Co-authored-by" line in git commits made by OpenClaude
# OPENCLAUDE_DISABLE_CO_AUTHORED_BY=1

# Custom timeout for API requests in milliseconds (default: varies)
# API_TIMEOUT_MS=60000

# Enable debug logging
# CLAUDE_DEBUG=1
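The template's `cp .env.example .env` instructions leave one step implicit: getting the values into the environment. A minimal sketch of one common way to do that from a shell (OpenClaude may also read the file itself; the demo file path and key value here are placeholders, not from the commit):

```shell
# Write a tiny stand-in for a filled-in .env, then export its variables.
demo_env=$(mktemp)
printf '# comment lines are ignored by the shell\nANTHROPIC_API_KEY=sk-ant-your-key-here\n' > "$demo_env"

set -a            # auto-export every variable assigned while this is on
. "$demo_env"     # source the file; lines are plain KEY=value assignments
set +a

echo "$ANTHROPIC_API_KEY"   # prints sk-ant-your-key-here
```

Note this pattern assumes simple KEY=value lines, as in the template above; values containing spaces would need shell quoting.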