orcs-code/src
Juan Camilo 708a0a18fe fix: report cached tokens from OpenAI prompt_tokens_details
OpenAI returns cached token counts in usage.prompt_tokens_details.cached_tokens
but the shim hardcoded cache_read_input_tokens to 0. This made prompt
caching invisible to the cost tracker and session summary even when
OpenAI's automatic caching was actively reducing costs.

Changes:
- Extend OpenAIStreamChunk usage interface with prompt_tokens_details
- Map cached_tokens to cache_read_input_tokens in convertChunkUsage() (see the sketch after this list)
- Same fix in _convertNonStreamingResponse() for non-streaming path
- cache_creation_input_tokens remains 0 (OpenAI auto-caching has no
  creation cost — it is free and automatic)
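
A minimal sketch of the mapping described above. The OpenAIStreamChunk usage shape and the Anthropic-style usage fields are reconstructed from the commit message, not copied from the shim source, and the input_tokens/output_tokens passthrough is assumed to be unchanged by this fix.

```ts
// Hypothetical shapes, inferred from the commit message.
interface OpenAIUsage {
  prompt_tokens: number;
  completion_tokens: number;
  // OpenAI reports automatic prompt-cache hits here.
  prompt_tokens_details?: {
    cached_tokens?: number;
  };
}

interface AnthropicStyleUsage {
  input_tokens: number;
  output_tokens: number;
  cache_read_input_tokens: number;
  cache_creation_input_tokens: number;
}

function convertChunkUsage(usage: OpenAIUsage): AnthropicStyleUsage {
  return {
    // Passed through as before the fix.
    input_tokens: usage.prompt_tokens,
    output_tokens: usage.completion_tokens,
    // Previously hardcoded to 0; now surfaced so the cost tracker
    // and session summary can see cache hits.
    cache_read_input_tokens: usage.prompt_tokens_details?.cached_tokens ?? 0,
    // OpenAI auto-caching has no creation step, so this stays 0.
    cache_creation_input_tokens: 0,
  };
}
```

The same mapping would apply in _convertNonStreamingResponse(), since the non-streaming response carries the same usage.prompt_tokens_details field.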
2026-04-02 15:21:37 +02:00