* fix: make OpenAI fallback context window configurable and support external lookup table
Unknown OpenAI-compatible models fell back to a hardcoded 128k constant,
causing auto-compact to fire prematurely on models with larger windows
(issue #635 follow-up). Three escape hatches are added without touching the
built-in table:
- CLAUDE_CODE_OPENAI_FALLBACK_CONTEXT_WINDOW (number): overrides the 128k
default for all unknown models.
- CLAUDE_CODE_OPENAI_CONTEXT_WINDOWS (JSON object): per-model overrides that
take precedence over the built-in OPENAI_CONTEXT_WINDOWS table; supports
the same provider-qualified and prefix-matching lookup as the built-in path.
- CLAUDE_CODE_OPENAI_MAX_OUTPUT_TOKENS (JSON object): same pattern for output
token limits.
This lets operators deploy new or private models without patching
openaiContextWindows.ts on every model release.
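A minimal sketch of the intended lookup precedence (the function and table names here are illustrative, not the actual implementation; for brevity it shows exact-match lookup only, whereas the real path also does provider-qualified and prefix matching):

```typescript
// Illustrative built-in table; the real one lives in openaiContextWindows.ts.
const BUILT_IN_CONTEXT_WINDOWS: Record<string, number> = {
  "gpt-4o": 128_000,
};

// Resolve a model's context window, honoring the new env var overrides.
function resolveContextWindow(
  model: string,
  env: Record<string, string | undefined>,
): number {
  // 1. Per-model JSON overrides take precedence over the built-in table.
  const overridesJson = env["CLAUDE_CODE_OPENAI_CONTEXT_WINDOWS"];
  if (overridesJson) {
    const overrides = JSON.parse(overridesJson) as Record<string, number>;
    if (overrides[model] !== undefined) return overrides[model];
  }
  // 2. Built-in table for known models.
  if (BUILT_IN_CONTEXT_WINDOWS[model] !== undefined) {
    return BUILT_IN_CONTEXT_WINDOWS[model];
  }
  // 3. Configurable fallback for unknown models.
  const fallback = env["CLAUDE_CODE_OPENAI_FALLBACK_CONTEXT_WINDOW"];
  if (fallback !== undefined && !Number.isNaN(Number(fallback))) {
    return Number(fallback);
  }
  // 4. Previous behavior: hardcoded 128k default.
  return 128_000;
}
```

CLAUDE_CODE_OPENAI_MAX_OUTPUT_TOKENS follows the same precedence for output token limits.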
* docs: add new OpenAI context window env vars to .env.example
Document CLAUDE_CODE_OPENAI_FALLBACK_CONTEXT_WINDOW,
CLAUDE_CODE_OPENAI_CONTEXT_WINDOWS, and
CLAUDE_CODE_OPENAI_MAX_OUTPUT_TOKENS with usage examples.
Addresses reviewer feedback on PR #861.
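For illustration, .env.example entries might look like the following (model names and values here are hypothetical placeholders, not recommendations):

```shell
# Override the 128k default for all unknown OpenAI-compatible models.
CLAUDE_CODE_OPENAI_FALLBACK_CONTEXT_WINDOW=200000

# Per-model context window overrides (JSON object); keys may use the same
# provider-qualified and prefix-matching forms as the built-in table.
CLAUDE_CODE_OPENAI_CONTEXT_WINDOWS='{"my-org/private-model": 1000000}'

# Same pattern for per-model output token limits.
CLAUDE_CODE_OPENAI_MAX_OUTPUT_TOKENS='{"my-org/private-model": 65536}'
```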
---------
Co-authored-by: opencode <dev@example.com>