fix: custom OPENAI_BASE_URL always wins over Codex model alias detection (#222)
* feat: add --provider CLI flag for multi-provider support

  Adds a --provider flag that maps friendly provider names to the environment
  variables the codebase uses for provider detection. No more manual env-var
  configuration; users can now simply run:

      openclaude --provider openai --model gpt-4o
      openclaude --provider gemini --model gemini-2.0-flash
      openclaude --provider ollama --model llama3.2
      openclaude --provider bedrock
      openclaude --provider vertex

  Implementation details:

  - providerFlag.ts: core logic; maps provider names to env vars, uses ??= so
    explicit env vars always win over the flag defaults
  - providerFlag.test.ts: 18 tests covering all 7 providers, error messages,
    model passthrough, and env-var precedence
  - cli.tsx: early fast-path (mirrors the --bare pattern) that sets env vars
    before Commander option-building and module constants run
  - main.tsx: adds --provider to the Commander option chain for --help

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: custom OPENAI_BASE_URL always wins over Codex model alias detection

  When OPENAI_MODEL=gpt-5.4 (or gpt-5.4-mini) and a custom OPENAI_BASE_URL is
  set (Azure, OpenRouter, etc.), the transport was incorrectly forced to
  codex_responses because gpt-5.4 is in CODEX_ALIAS_MODELS. This caused
  requests to be sent with Codex auth instead of the user's API key, resulting
  in 401 Unauthorized errors.

  Fix: only use codex_responses when the base URL is explicitly the Codex
  endpoint, OR when no custom base URL is set and the model is a Codex alias.
  An explicit OPENAI_BASE_URL always takes priority over model-name-based
  Codex detection.

  Verified locally: gpt-5.4 via OpenRouter now correctly shows
  Provider=OpenRouter, Endpoint=https://openrouter.ai/api/v1 instead of
  routing to chatgpt.com/backend-api/codex.

  Fixes #200, #203

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
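The `??=` precedence rule described above can be sketched as a small helper. This is a minimal illustration, not the actual providerFlag.ts code; the function and variable names here are hypothetical.

```typescript
// Hypothetical sketch of the ??= precedence rule: flag-derived defaults
// are applied only where the user has not already set a value, so explicit
// env vars always win over --provider defaults.
type Env = Record<string, string | undefined>;

function applyProviderDefaults(env: Env, flagDefaults: Env): Env {
  for (const [key, value] of Object.entries(flagDefaults)) {
    env[key] ??= value; // no-op when the key is already set
  }
  return env;
}

// User explicitly set OPENAI_MODEL, so only the missing key is filled in.
const env = applyProviderDefaults(
  { OPENAI_MODEL: "gpt-4o-mini" },
  { OPENAI_MODEL: "gpt-4o", OPENAI_BASE_URL: "https://api.openai.com/v1" },
);
console.log(env.OPENAI_MODEL);    // "gpt-4o-mini" (explicit value kept)
console.log(env.OPENAI_BASE_URL); // "https://api.openai.com/v1" (default applied)
```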
@@ -291,8 +291,13 @@ export function resolveProviderRequest(options?: {
     process.env.OPENAI_BASE_URL ??
     process.env.OPENAI_API_BASE ??
     undefined
+  // Use Codex transport only when:
+  // - the base URL is explicitly the Codex endpoint, OR
+  // - the model is a Codex alias AND no custom base URL has been set
+  // A custom OPENAI_BASE_URL (e.g. Azure, OpenRouter) always wins over
+  // model-name-based Codex detection to prevent auth failures (#200, #203).
   const transport: ProviderTransport =
-    isCodexAlias(requestedModel) || isCodexBaseUrl(rawBaseUrl)
+    isCodexBaseUrl(rawBaseUrl) || (!rawBaseUrl && isCodexAlias(requestedModel))
       ? 'codex_responses'
       : 'chat_completions'
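The fixed transport rule can be restated as a standalone function. The condition and the `isCodexAlias`/`isCodexBaseUrl` names come from the diff; their implementations below are stubbed for illustration (the alias set and endpoint check are assumptions based on the commit message, not the repo's actual code).

```typescript
type ProviderTransport = "codex_responses" | "chat_completions";

// Stubbed per the commit message: gpt-5.4 and gpt-5.4-mini are Codex aliases.
const CODEX_ALIAS_MODELS = new Set(["gpt-5.4", "gpt-5.4-mini"]);
const isCodexAlias = (model?: string): boolean =>
  !!model && CODEX_ALIAS_MODELS.has(model);

// Assumed check against the Codex endpoint mentioned in the commit message.
const isCodexBaseUrl = (url?: string): boolean =>
  !!url && url.includes("chatgpt.com/backend-api/codex");

// The fixed rule: a custom base URL disables alias-based Codex detection.
function pickTransport(model?: string, rawBaseUrl?: string): ProviderTransport {
  return isCodexBaseUrl(rawBaseUrl) || (!rawBaseUrl && isCodexAlias(model))
    ? "codex_responses"
    : "chat_completions";
}

// gpt-5.4 with a custom base URL (e.g. OpenRouter) keeps the user's API key:
console.log(pickTransport("gpt-5.4", "https://openrouter.ai/api/v1")); // "chat_completions"
// gpt-5.4 with no base URL still routes to Codex via the alias:
console.log(pickTransport("gpt-5.4", undefined)); // "codex_responses"
```

The key difference from the removed line is the `!rawBaseUrl` guard: previously `isCodexAlias` alone could force Codex auth even when the user had pointed OPENAI_BASE_URL elsewhere, causing the 401 errors in #200 and #203.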