* feat(provider): first-class Moonshot (Kimi) direct-API support

  Moonshot's direct API (api.moonshot.ai/v1) is OpenAI-compatible and works
  today via the generic OpenAI shim, including the reasoning_content channel
  that Kimi returns alongside the user-visible content. But the UX was rough:
  an unknown context window triggered the conservative 128k fallback plus a
  warning, and the provider displayed as "Local OpenAI-compatible".

  Makes Moonshot a recognized provider:

  - src/utils/model/openaiContextWindows.ts: add the Kimi K2 family and
    moonshot-v1-* variants to both the context-window and max-output tables.
    Values from Moonshot's model card — K2.6 and K2-thinking are 256K,
    K2/K2-instruct are 128K, moonshot-v1 sizes are embedded in the model id.
  - src/utils/providerDiscovery.ts: recognize the api.moonshot.ai hostname
    and label it "Moonshot (Kimi)" in the startup banner and provider UI.

  Users can now launch with:

    CLAUDE_CODE_USE_OPENAI=1 \
    OPENAI_BASE_URL=https://api.moonshot.ai/v1 \
    OPENAI_API_KEY=sk-... \
    OPENAI_MODEL=kimi-k2.6 \
    openclaude

  and get accurate compaction, correct labeling, and correct max_tokens out
  of the box.

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* fix(openai-shim): Moonshot API compatibility — max_tokens + strip store

  Moonshot's direct API (api.moonshot.ai and api.moonshot.cn) uses the
  classic OpenAI `max_tokens` parameter, not the newer `max_completion_tokens`
  that the shim defaults to. It also hasn't published support for `store` and
  may reject it on strict parse — the same class of error as Gemini's
  "Unknown name 'store': Cannot find field" 400.

  - Adds isMoonshotBaseUrl(), which recognizes both the .ai and .cn hosts.
  - Converts max_completion_tokens → max_tokens for Moonshot requests
    (alongside GitHub / Mistral / local providers).
  - Strips body.store for Moonshot requests (alongside Mistral / Gemini).

  Two shim tests cover both the .ai and .cn hostnames.
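The two shim rules can be sketched in isolation. isMoonshotBaseUrl() is the helper named in the commit; the rewriteForMoonshot helper and the ChatBody shape are illustrative assumptions, not the shim's real types.

```typescript
function isMoonshotBaseUrl(baseUrl: string): boolean {
  try {
    const host = new URL(baseUrl).hostname
    // Moonshot's direct API is served from both TLDs.
    return host === 'api.moonshot.ai' || host === 'api.moonshot.cn'
  } catch {
    return false
  }
}

interface ChatBody {
  model: string
  max_completion_tokens?: number
  max_tokens?: number
  store?: boolean
  [key: string]: unknown
}

// Hypothetical request rewrite illustrating the two commit bullets.
function rewriteForMoonshot(baseUrl: string, body: ChatBody): ChatBody {
  if (!isMoonshotBaseUrl(baseUrl)) return body
  const out: ChatBody = { ...body }
  // Moonshot expects the classic max_tokens parameter.
  if (out.max_completion_tokens !== undefined) {
    out.max_tokens = out.max_completion_tokens
    delete out.max_completion_tokens
  }
  // Strict-parsing backends can 400 on unknown fields like `store`.
  delete out.store
  return out
}
```

Non-Moonshot requests pass through untouched, so the rewrite composes with the existing GitHub / Mistral / local branches.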
  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* fix: null-safe access on getCachedMCConfig() in external builds

  External builds stub src/services/compact/cachedMicrocompact.ts so
  getCachedMCConfig() returns null, but two call sites still dereferenced
  config.supportedModels directly. The ?. operator was in the wrong place
  (config.supportedModels? instead of config?.supportedModels), so the null
  config threw "Cannot read properties of null (reading 'supportedModels')"
  on every request.

  Reproduces with any external-build provider (notably Kimi/Moonshot, just
  enabled in the sibling commits, but equally DeepSeek, Mistral, Groq,
  Ollama, etc.):

    ❯ hey
    ⏺ Cannot read properties of null (reading 'supportedModels')

  - prompts.ts: early-return from getFunctionResultClearingSection() when
    config is null, before touching .supportedModels.
  - claude.ts: guard the debug-log jsonStringify with ?. so the log line
    never throws.

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* fix(startup): show "Moonshot (Kimi)" on the startup banner

  The startup-screen provider detector had regex branches for OpenRouter,
  DeepSeek, Groq, Together, Azure, etc., but nothing for Moonshot. Remote
  Moonshot sessions fell through to the generic "OpenAI" label —
  getLocalOpenAICompatibleProviderLabel() only runs for local URLs, and
  api.moonshot.ai / api.moonshot.cn are not local.

  Adds a Moonshot branch matching /moonshot/ in the base URL OR /kimi/ in
  the model id. Now launching with:

    OPENAI_BASE_URL=https://api.moonshot.ai/v1
    OPENAI_MODEL=kimi-k2.6

  displays the Provider row as "Moonshot (Kimi)" instead of "OpenAI".

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

* refactor(provider): sort preset picker alphabetically; Custom at end

  The /provider preset picker was in ad-hoc order (Anthropic, Ollama,
  OpenAI, then a jumble of third-party / local / codex / Alibaba / custom /
  nvidia / minimax). Hard to scan when you know the provider name you want.

  Sorts the list alphabetically by label A→Z.
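The misplaced-optional-chaining bug reduces to a few lines. getCachedMCConfig and MCConfig here are stand-ins for the real module, which external builds stub out to return null.

```typescript
interface MCConfig {
  supportedModels: string[]
}

function getCachedMCConfig(): MCConfig | null {
  return null // what the external-build stub returns
}

function supportedModels(): string[] {
  const config = getCachedMCConfig()
  // Broken form: config.supportedModels?.length — the ?. sits after the
  // property access, so the null config is dereferenced before ?. applies.
  // Fixed form: guard the config object itself.
  return config?.supportedModels ?? []
}
```

With the guard on `config` rather than on `supportedModels`, the null case falls through to the `?? []` default instead of throwing.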
  Pins "Custom" to the end — it's the catch-all / escape hatch, so it's
  scanned last rather than shuffled into the alphabetical run where a user
  looking for a named provider might grab it by mistake. The first-run-only
  "Skip for now" stays at the very bottom, after Custom.

  Test churn:

  - ProviderManager.test.tsx: four tests hardcoded press counts (1 or 3 'j'
    presses) that broke when targets moved. Replaces them with a
    navigateToPreset(stdin, label) helper driven from a declared
    PRESET_ORDER array, so future list edits only update the array.
  - ConsoleOAuthFlow.test.tsx: the 13-row test frame only renders the first
    ~13 providers. The "Ollama", "OpenAI", and "LM Studio" sentinels moved
    below the fold; swap them for alphabetically-early providers still
    visible in-frame ("Azure OpenAI", "DeepSeek", "Google Gemini"). Test
    intent (picker opened with providers listed) is preserved.

  Co-Authored-By: OpenClaude <openclaude@gitlawb.com>

---------

Co-authored-by: OpenClaude <openclaude@gitlawb.com>
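The ordering rule from the refactor commit fits in one comparator: alphabetical by label, with "Custom" pinned last. The Preset shape and labels below are illustrative, not the picker's real types.

```typescript
interface Preset {
  label: string
}

function sortPresets(presets: Preset[]): Preset[] {
  const isCustom = (p: Preset) => p.label === 'Custom'
  return [...presets].sort((a, b) => {
    // "Custom" sorts after everything else, regardless of alphabet.
    if (isCustom(a) !== isCustom(b)) return isCustom(a) ? 1 : -1
    return a.label.localeCompare(b.label)
  })
}
```

Keeping the pin inside the comparator (rather than splicing "Custom" out and re-appending it) means a single sort call stays the source of truth for the order.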
122 lines
3.1 KiB
TypeScript
import { PassThrough } from 'node:stream'

import { expect, test } from 'bun:test'
import React from 'react'
import stripAnsi from 'strip-ansi'

import { AppStateProvider } from '../state/AppState.js'
import { createRoot } from '../ink.js'
import { KeybindingSetup } from '../keybindings/KeybindingProviderSetup.js'
import { ConsoleOAuthFlow } from './ConsoleOAuthFlow.js'

const SYNC_START = '\x1B[?2026h'
const SYNC_END = '\x1B[?2026l'

function extractLastFrame(output: string): string {
  let lastFrame: string | null = null
  let cursor = 0

  while (cursor < output.length) {
    const start = output.indexOf(SYNC_START, cursor)
    if (start === -1) {
      break
    }

    const contentStart = start + SYNC_START.length
    const end = output.indexOf(SYNC_END, contentStart)
    if (end === -1) {
      break
    }

    const frame = output.slice(contentStart, end)
    if (frame.trim().length > 0) {
      lastFrame = frame
    }
    cursor = end + SYNC_END.length
  }

  return lastFrame ?? output
}

function createTestStreams(): {
  stdout: PassThrough
  stdin: PassThrough & {
    isTTY: boolean
    setRawMode: (mode: boolean) => void
    ref: () => void
    unref: () => void
  }
  getOutput: () => string
} {
  let output = ''
  const stdout = new PassThrough()
  const stdin = new PassThrough() as PassThrough & {
    isTTY: boolean
    setRawMode: (mode: boolean) => void
    ref: () => void
    unref: () => void
  }

  stdin.isTTY = true
  stdin.setRawMode = () => {}
  stdin.ref = () => {}
  stdin.unref = () => {}
  ;(stdout as unknown as { columns: number }).columns = 120
  stdout.on('data', chunk => {
    output += chunk.toString()
  })

  return {
    stdout,
    stdin,
    getOutput: () => output,
  }
}

async function renderFrame(node: React.ReactNode): Promise<string> {
  const { stdout, stdin, getOutput } = createTestStreams()
  const root = await createRoot({
    stdout: stdout as unknown as NodeJS.WriteStream,
    stdin: stdin as unknown as NodeJS.ReadStream,
    patchConsole: false,
  })

  root.render(
    <AppStateProvider>
      <KeybindingSetup>{node}</KeybindingSetup>
    </AppStateProvider>,
  )

  await Bun.sleep(50)
  root.unmount()
  stdin.end()
  stdout.end()
  await Bun.sleep(25)

  return stripAnsi(extractLastFrame(getOutput()))
}

test('login picker shows the third-party platform option', async () => {
  const output = await renderFrame(<ConsoleOAuthFlow onDone={() => {}} />)

  expect(output).toContain('Select login method:')
  expect(output).toContain('3rd-party platform')
})

test('third-party provider branch opens the first-run provider manager', async () => {
  const output = await renderFrame(
    <ConsoleOAuthFlow
      initialStatus={{ state: 'platform_setup' }}
      onDone={() => {}}
    />,
  )

  expect(output).toContain('Set up provider')
  // Use alphabetically-early sentinels so they remain visible in the
  // 13-row test frame after the provider list was sorted A→Z.
  expect(output).toContain('Anthropic')
  expect(output).toContain('Azure OpenAI')
  expect(output).toContain('DeepSeek')
  expect(output).toContain('Google Gemini')
})