feat(zai): add Z.AI GLM Coding Plan provider preset (#896)

* feat(zai): add Z.AI GLM Coding Plan provider preset

Add dedicated Z.AI provider support for the GLM Coding Plan, enabling
use of GLM-5.1, GLM-5-Turbo, GLM-4.7, and GLM-4.5-Air models through
the OpenAI-compatible shim with proper thinking mode (reasoning_content),
max_tokens handling, and context window sizing.
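
A rough sketch of what such a preset could look like is below; the object
shape, field names, and base URL are assumptions for illustration, not the
actual config added by this commit.

  // Hypothetical preset shape; names, URL, and fields are assumptions.
  const zaiGlmCodingPlanPreset = {
    name: 'Z.AI GLM Coding Plan',
    // Assumed OpenAI-compatible endpoint; the real code validates the
    // configured base URL via isZaiBaseUrl() (see the diff below).
    baseUrl: 'https://api.z.ai/api/coding/paas/v4',
    models: ['GLM-5.1', 'GLM-5-Turbo', 'GLM-4.7', 'GLM-4.5-Air'],
    // On the OpenAI-compatible shim, thinking output arrives as
    // reasoning_content rather than regular content.
    supportsThinking: true,
  } as const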

* fix(zai): unify GLM max output token limits across casing variants

The lowercase glm-5/glm-4.7 keys had a conservative 16K max output while
the uppercase GLM-5/GLM-4.7 keys had 131K. Use consistent Z.AI coding plan
limits for all GLM variants.

* fix(zai): restore DashScope GLM limits, enable GLM thinking support

- Restore lowercase glm-5/glm-4.7 to 16_384 max output (DashScope limits)
  while keeping Z.AI coding plan high limits on uppercase GLM-* keys only
  (see the sketch after this list)
- Add GLM model support to modelSupportsThinking() so reasoning_content
  is enabled when using GLM-5.x/GLM-4.7 models on Z.AI
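
The sketch referenced above: the map name and shape are assumptions, while
the 16_384 and ~131K figures come from the commit notes.

  // Hypothetical limits map; only the numbers are taken from this commit.
  const maxOutputTokens: Record<string, number> = {
    // Lowercase keys: DashScope-served GLM models keep the conservative limit.
    'glm-5': 16_384,
    'glm-4.7': 16_384,
    // Uppercase keys: Z.AI GLM Coding Plan models keep the higher limit.
    'GLM-5': 131_072,
    'GLM-4.7': 131_072,
  }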

* fix(zai): tighten GLM regexes, fix misleading context window comment

- Use precise regex in thinking.ts: exact GLM model matches only,
  no false positives on glm-50/glm-4, includes glm-4.5-air (see the
  matcher sketch after this list)
- Use uppercase-only match in StartupScreen rawModel fallback so
  DashScope lowercase glm-* models aren't mislabeled as Z.AI
- Clarify context window comment: lowercase glm-5.1/glm-5-turbo/
  glm-4.5-air are Z.AI-specific aliases, not DashScope
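
The matcher sketch referenced above: the exact regexes shipped in thinking.ts
and StartupScreen are not quoted in this commit message, so treat the
patterns below as illustrative.

  // Exact-match GLM detection (sketch): accepts glm-5, glm-5.1, glm-5-turbo,
  // glm-4.7 and glm-4.5-air; rejects glm-50 and glm-4.
  const ZAI_GLM_MODEL_RE = /^glm-(?:5(?:\.\d+)?|5-turbo|4\.7|4\.5-air)$/i

  export function isZaiGlmModel(model: string): boolean {
    return ZAI_GLM_MODEL_RE.test(model)
  }

  // StartupScreen rawModel fallback (sketch): case-sensitive prefix check so
  // lowercase DashScope glm-* ids are not mislabeled as Z.AI.
  export function looksLikeZaiGlmRawModel(rawModel: string): boolean {
    return /^GLM-/.test(rawModel)
  }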

* fix(zai): scope GLM detection to Z.AI

* improve readability of max_completion_tokens check
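
No code is shown for this change; one common shape of such a readability fix
(purely illustrative, names and model detection below are assumptions) is to
pull the condition into a named helper:

  // Illustrative sketch only; the real check in this repo may differ.
  function tokenLimitParams(model: string, maxTokens: number) {
    // Some OpenAI-compatible models expect max_completion_tokens instead of
    // max_tokens (assumed detection below).
    const usesMaxCompletionTokens = /^(o\d|gpt-5)/.test(model)
    return usesMaxCompletionTokens
      ? { max_completion_tokens: maxTokens }
      : { max_tokens: maxTokens }
  }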

Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: Copilot Autofix powered by AI <175728472+Copilot@users.noreply.github.com>
Author:    chioarub
Date:      2026-04-26 03:18:59 +03:00
Committer: GitHub
Parent:    29f7579377
Commit:    a0d657ee18
16 changed files with 342 additions and 6 deletions


@@ -6,6 +6,7 @@ import { getCanonicalName } from './model/model.js'
 import { get3PModelCapabilityOverride } from './model/modelSupportOverrides.js'
 import { getAPIProvider } from './model/providers.js'
 import { getSettingsWithErrors } from './settings/settings.js'
+import { isZaiBaseUrl, isZaiGlmModel } from './zaiProvider.js'
 
 export type ThinkingConfig =
   | { type: 'adaptive' }
@@ -111,6 +112,13 @@ export function modelSupportsThinking(model: string): boolean {
   ) {
     return true
   }
+  if (
+    provider === 'openai' &&
+    isZaiBaseUrl(process.env.OPENAI_BASE_URL ?? process.env.OPENAI_API_BASE) &&
+    isZaiGlmModel(canonical)
+  ) {
+    return true
+  }
   // 3P (Bedrock/Vertex): only Opus 4+ and Sonnet 4+
   return canonical.includes('sonnet-4') || canonical.includes('opus-4')
 }
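
The two helpers imported from zaiProvider.js are not part of this hunk. A
minimal sketch of what they could look like, assuming Z.AI detection is done
by hostname and the exact-match GLM pattern sketched earlier:

  // zaiProvider.ts (sketch; the shipped implementation is not shown here)
  export function isZaiBaseUrl(baseUrl: string | undefined): boolean {
    if (!baseUrl) return false
    try {
      const host = new URL(baseUrl).hostname
      return host === 'z.ai' || host.endsWith('.z.ai')
    } catch {
      return false
    }
  }

  export function isZaiGlmModel(model: string): boolean {
    // Assumed to reuse the exact-match GLM pattern sketched earlier.
    return /^glm-(?:5(?:\.\d+)?|5-turbo|4\.7|4\.5-air)$/i.test(model)
  }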