Feat/kimi moonshot support (#805)
* feat(provider): first-class Moonshot (Kimi) direct-API support

  Moonshot's direct API (api.moonshot.ai/v1) is OpenAI-compatible and works today via the generic OpenAI shim, including the reasoning_content channel that Kimi returns alongside the user-visible content. But the UX was rough: an unknown context window triggered the conservative 128k fallback plus a warning, and the provider displayed as "Local OpenAI-compatible". This makes Moonshot a recognized provider:

  - src/utils/model/openaiContextWindows.ts: add the Kimi K2 family and moonshot-v1-* variants to both the context-window and max-output tables. Values come from Moonshot's model card — K2.6 and K2-thinking are 256K, K2/K2-instruct are 128K, and moonshot-v1 sizes are embedded in the model id.
  - src/utils/providerDiscovery.ts: recognize the api.moonshot.ai hostname and label it "Moonshot (Kimi)" in the startup banner and provider UI.

  Users can now launch with:

      CLAUDE_CODE_USE_OPENAI=1 \
      OPENAI_BASE_URL=https://api.moonshot.ai/v1 \
      OPENAI_API_KEY=sk-... \
      OPENAI_MODEL=kimi-k2.6 \
      openclaude

  and get accurate compaction, correct labeling, and correct max_tokens out of the box.

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* fix(openai-shim): Moonshot API compatibility — max_tokens + strip store

  Moonshot's direct API (api.moonshot.ai and api.moonshot.cn) uses the classic OpenAI `max_tokens` parameter, not the newer `max_completion_tokens` that the shim defaults to. It also hasn't published support for `store` and may reject it on strict parsing — the same class of error as Gemini's "Unknown name 'store': Cannot find field" 400.

  - Adds isMoonshotBaseUrl(), which recognizes both the .ai and .cn hosts.
  - Converts max_completion_tokens → max_tokens for Moonshot requests (alongside GitHub / Mistral / local providers).
  - Strips body.store for Moonshot requests (alongside Mistral / Gemini).

  Two shim tests cover both the .ai and .cn hostnames.
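The shim-side rewrite described in this commit can be sketched roughly as follows. Only the name isMoonshotBaseUrl and the two hostnames come from the commit message; the adaptBodyForMoonshot helper and the request-body shape are invented here for illustration and are not the repository's actual code:

```typescript
// Sketch only: adaptBodyForMoonshot and ChatBody are hypothetical names.
function isMoonshotBaseUrl(baseUrl: string): boolean {
  try {
    const host = new URL(baseUrl).hostname;
    return host === 'api.moonshot.ai' || host === 'api.moonshot.cn';
  } catch {
    return false;
  }
}

interface ChatBody {
  max_completion_tokens?: number;
  max_tokens?: number;
  store?: boolean;
  [key: string]: unknown;
}

function adaptBodyForMoonshot(body: ChatBody, baseUrl: string): ChatBody {
  if (!isMoonshotBaseUrl(baseUrl)) return body;
  const adapted = { ...body };
  // Moonshot expects the classic `max_tokens` parameter.
  if (adapted.max_completion_tokens !== undefined) {
    adapted.max_tokens = adapted.max_completion_tokens;
    delete adapted.max_completion_tokens;
  }
  // `store` is not documented as supported and may be rejected on strict parse.
  delete adapted.store;
  return adapted;
}
```

Matching on the parsed hostname rather than a substring of the base URL avoids false positives such as a proxy path that merely contains "moonshot".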
Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* fix: null-safe access on getCachedMCConfig() in external builds

  External builds stub src/services/compact/cachedMicrocompact.ts so getCachedMCConfig() returns null, but two call sites still dereferenced config.supportedModels directly. The ?. operator was in the wrong place (config.supportedModels? instead of config?.supportedModels), so the null config threw "Cannot read properties of null (reading 'supportedModels')" on every request.

  Reproduces with any external-build provider (notably Kimi/Moonshot, just enabled in the sibling commits, but equally DeepSeek, Mistral, Groq, Ollama, etc.):

      ❯ hey
      ⏺ Cannot read properties of null (reading 'supportedModels')

  - prompts.ts: early-return from getFunctionResultClearingSection() when config is null, before touching .supportedModels.
  - claude.ts: guard the debug-log jsonStringify with ?. so the log line never throws.

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* fix(startup): show "Moonshot (Kimi)" on the startup banner

  The startup-screen provider detector had regex branches for OpenRouter, DeepSeek, Groq, Together, Azure, etc., but nothing for Moonshot. Remote Moonshot sessions fell through to the generic "OpenAI" label — getLocalOpenAICompatibleProviderLabel() only runs for local URLs, and api.moonshot.ai / api.moonshot.cn are not local.

  Adds a Moonshot branch matching /moonshot/ in the base URL or /kimi/ in the model id. Launching with:

      OPENAI_BASE_URL=https://api.moonshot.ai/v1
      OPENAI_MODEL=kimi-k2.6

  now displays the Provider row as "Moonshot (Kimi)" instead of "OpenAI".

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* refactor(provider): sort preset picker alphabetically; Custom at end

  The /provider preset picker was in ad-hoc order (Anthropic, Ollama, OpenAI, then a jumble of third-party / local / codex / Alibaba / custom / nvidia / minimax), which was hard to scan when you know the name of the provider you want. This sorts the list alphabetically by label, A→Z.
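The operator-placement bug behind the null-safe fix above is easy to see in isolation. This is an illustrative reconstruction, not the repository's code; only the getCachedMCConfig and supportedModels names come from the commit message, and the surrounding function is invented:

```typescript
// Illustrative sketch of the ?. placement bug described in the commit.
interface MCConfig { supportedModels: string[] }

function getCachedMCConfig(): MCConfig | null {
  return null; // external builds stub this module out
}

function getSupportedModels(): string[] {
  const config = getCachedMCConfig();
  // Wrong:  config.supportedModels?.length  guards the *property*, but still
  //         dereferences the null `config` itself and throws.
  // Right:  config?.supportedModels  short-circuits when config is null.
  return config?.supportedModels ?? [];
}
```

Optional chaining only protects what comes after the `?.`, so the guard has to sit on the first value that can be null.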
  It pins "Custom" to the end — it's the catch-all / escape hatch, so it is scanned last rather than shuffled into the alphabetical run where a user looking for a named provider might grab it by mistake. The first-run-only "Skip for now" stays at the very bottom, after Custom.

  Test churn:

  - ProviderManager.test.tsx: four tests hardcoded press counts (1 or 3 'j' presses) that broke when targets moved. Replaces them with a navigateToPreset(stdin, label) helper driven from a declared PRESET_ORDER array, so future list edits only require updating the array.
  - ConsoleOAuthFlow.test.tsx: the 13-row test frame only renders the first ~13 providers, so the "Ollama", "OpenAI", and "LM Studio" sentinels moved below the fold; swap them for alphabetically early providers still visible in-frame ("Azure OpenAI", "DeepSeek", "Google Gemini"). The test intent (picker opened with providers listed) is preserved.

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

---------

Co-authored-by: OpenClaude <openclaude@gitlawb.com>
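The ordering rule from the refactor commit (alphabetical by label, catch-all entries pinned after the named providers) could be sketched as a comparator like the one below. The PresetOption shape, sortPresets, and PINNED_LAST are simplified inventions; the actual picker simply declares its options inline in the sorted order:

```typescript
// Sketch of the ordering rule, not the repository's implementation.
interface PresetOption { value: string; label: string }

// Catch-all entries go after all named providers, in this relative order.
const PINNED_LAST = ['Custom', 'Skip for now'];

function sortPresets(options: PresetOption[]): PresetOption[] {
  return [...options].sort((a, b) => {
    const ai = PINNED_LAST.indexOf(a.label);
    const bi = PINNED_LAST.indexOf(b.label);
    if (ai !== -1 || bi !== -1) {
      // Unpinned entries rank as -1; pinned entries rank 1, 2, ... so they
      // sort after everything else and keep their PINNED_LAST order.
      return (ai === -1 ? -1 : ai + 1) - (bi === -1 ? -1 : bi + 1);
    }
    // Named providers: plain alphabetical order by label.
    return a.label.localeCompare(b.label);
  });
}
```

Declaring the options pre-sorted (as the commit does) keeps the render path trivial; a comparator like this would only be worth it if the list were assembled dynamically.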
@@ -1094,21 +1094,30 @@ export function ProviderManager({ mode, onDone }: Props): React.ReactNode {
   function renderPresetSelection(): React.ReactNode {
     const canUseCodexOAuth = !isBareMode()
+    // Providers sorted alphabetically by label. `Custom` is pinned to the end
+    // because it's the catch-all / escape hatch — users scanning the list
+    // should always find known providers first. `Skip for now` (first-run
+    // only) comes last, after Custom.
     const options = [
       {
+        value: 'dashscope-intl',
+        label: 'Alibaba Coding Plan',
+        description: 'Alibaba DashScope International endpoint',
+      },
+      {
+        value: 'dashscope-cn',
+        label: 'Alibaba Coding Plan (China)',
+        description: 'Alibaba DashScope China endpoint',
+      },
+      {
         value: 'anthropic',
         label: 'Anthropic',
         description: 'Native Claude API (x-api-key auth)',
       },
       {
-        value: 'ollama',
-        label: 'Ollama',
-        description: 'Local or remote Ollama endpoint',
-      },
-      {
-        value: 'openai',
-        label: 'OpenAI',
-        description: 'OpenAI API with API key',
+        value: 'azure-openai',
+        label: 'Azure OpenAI',
+        description: 'Azure OpenAI endpoint (model=deployment name)',
       },
       ...(canUseCodexOAuth
         ? [
@@ -1120,11 +1129,6 @@ export function ProviderManager({ mode, onDone }: Props): React.ReactNode {
         },
         ]
         : []),
       {
-        value: 'moonshotai',
-        label: 'Moonshot AI',
-        description: 'Kimi OpenAI-compatible endpoint',
-      },
-      {
         value: 'deepseek',
         label: 'DeepSeek',
@@ -1135,50 +1139,30 @@ export function ProviderManager({ mode, onDone }: Props): React.ReactNode {
         label: 'Google Gemini',
         description: 'Gemini OpenAI-compatible endpoint',
       },
       {
-        value: 'together',
-        label: 'Together AI',
-        description: 'Together chat/completions endpoint',
-      },
-      {
         value: 'groq',
         label: 'Groq',
         description: 'Groq OpenAI-compatible endpoint',
       },
       {
-        value: 'mistral',
-        label: 'Mistral',
-        description: 'Mistral OpenAI-compatible endpoint',
-      },
-      {
-        value: 'azure-openai',
-        label: 'Azure OpenAI',
-        description: 'Azure OpenAI endpoint (model=deployment name)',
-      },
-      {
-        value: 'openrouter',
-        label: 'OpenRouter',
-        description: 'OpenRouter OpenAI-compatible endpoint',
-      },
-      {
         value: 'lmstudio',
         label: 'LM Studio',
         description: 'Local LM Studio endpoint',
       },
       {
-        value: 'dashscope-cn',
-        label: 'Alibaba Coding Plan (China)',
-        description: 'Alibaba DashScope China endpoint',
+        value: 'minimax',
+        label: 'MiniMax',
+        description: 'MiniMax API endpoint',
       },
       {
-        value: 'dashscope-intl',
-        label: 'Alibaba Coding Plan',
-        description: 'Alibaba DashScope International endpoint',
+        value: 'mistral',
+        label: 'Mistral',
+        description: 'Mistral OpenAI-compatible endpoint',
       },
       {
-        value: 'custom',
-        label: 'Custom',
-        description: 'Any OpenAI-compatible provider',
+        value: 'moonshotai',
+        label: 'Moonshot AI',
+        description: 'Kimi OpenAI-compatible endpoint',
       },
       {
         value: 'nvidia-nim',
@@ -1186,9 +1170,29 @@ export function ProviderManager({ mode, onDone }: Props): React.ReactNode {
         description: 'NVIDIA NIM endpoint',
       },
       {
-        value: 'minimax',
-        label: 'MiniMax',
-        description: 'MiniMax API endpoint',
+        value: 'ollama',
+        label: 'Ollama',
+        description: 'Local or remote Ollama endpoint',
+      },
+      {
+        value: 'openai',
+        label: 'OpenAI',
+        description: 'OpenAI API with API key',
+      },
+      {
+        value: 'openrouter',
+        label: 'OpenRouter',
+        description: 'OpenRouter OpenAI-compatible endpoint',
+      },
+      {
+        value: 'together',
+        label: 'Together AI',
+        description: 'Together chat/completions endpoint',
+      },
+      {
+        value: 'custom',
+        label: 'Custom',
+        description: 'Any OpenAI-compatible provider',
       },
       ...(mode === 'first-run'
         ? [