Add a new module that builds a structural map of the repository by parsing
source files with tree-sitter, building a cross-file reference graph
weighted by IDF, ranking files with PageRank, and rendering a
token-budgeted summary of the most important files and their signatures.
Stage 1 — Core module (src/context/repoMap/):
Symbol extraction via web-tree-sitter WASM, IDF-weighted reference graph
via graphology, PageRank ranking, token-budgeted rendering via js-tiktoken
cl100k_base, disk cache with mtime invalidation. Supports TypeScript,
JavaScript, and Python. 10 tests.
Stage 2 — RepoMap tool (src/tools/RepoMapTool/):
buildTool wrapper registered in src/tools.ts. Read-only, concurrency-safe.
Supports focus_files, focus_symbols, and max_tokens parameters. 9 tests.
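The max_tokens budget amounts to a greedy cut over the ranked file summaries. A minimal sketch, using a crude whitespace token count in place of js-tiktoken's cl100k_base encoder (the real tool's internals may differ):

```python
def count_tokens(text):
    # Crude stand-in for a real tokenizer such as cl100k_base.
    return len(text.split())

def render_map(ranked_files, max_tokens):
    # ranked_files: [(path, signature_summary), ...] sorted by importance.
    # Greedily include whole entries until the budget is exhausted;
    # skipping (rather than breaking) lets smaller entries still fit.
    out, used = [], 0
    for path, summary in ranked_files:
        chunk = f"{path}:\n{summary}"
        cost = count_tokens(chunk)
        if used + cost > max_tokens:
            continue
        out.append(chunk)
        used += cost
    return "\n".join(out)
```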
Stage 3 — Integration:
Auto-injection into session context behind REPO_MAP feature flag (off by
default). /repomap slash command with --tokens, --focus, --stats, and
--invalidate flags. User-facing docs in docs/repo-map.md. 13 tests.
With the flag off, the system context is byte-identical to previous behavior.
Dependencies: web-tree-sitter, tree-sitter-wasms, graphology,
graphology-pagerank, graphology-operators, js-tiktoken
Tests: 32 new, 621 total passing, 0 failures.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Document the new `ollama launch openclaude` command as a shortcut
for running OpenClaude through a local Ollama instance. This is
now supported in Ollama's launch system; the command handles all
environment variable setup automatically, so no manual env vars are needed.
Changes:
- README.md: Add "Using Ollama's launch command" section after the
manual Ollama env var setup, and update the provider table to
list `ollama launch` as a setup path for Ollama
- docs/advanced-setup.md: Add `ollama launch` as the recommended
method at the top of the Ollama section, with the manual env var
approach kept below as an alternative
* docs: add LiteLLM proxy setup guide
Document the setup process for LiteLLM and its integration with OpenClaude, including prerequisites, configuration, and troubleshooting steps.
* docs: fix /provider walkthrough to match actual OpenAI-compatible flow
Revise LiteLLM setup steps for API key and model.
Updated setup instructions for LiteLLM provider configuration.
* docs: fix sub-bullet formatting in /provider steps
* docs: clarify key scope in troubleshooting (LiteLLM proxy process env)
Clarified instruction for upstream provider error regarding API key.
* ## PR: Add LM Studio Provider Support
### Summary
Adds comprehensive LM Studio integration to OpenClaude, following the same pattern as the existing Ollama provider. LM Studio is a popular local LLM inference tool that exposes an OpenAI-compatible API.
### Changes (4 files, 672 insertions)
**New Files:**
- `lmstudio_provider.py` (377 lines) - Full provider implementation with:
- Health check functions (`check_lmstudio_running`)
- Model listing (`list_lmstudio_models`)
- Chat completion (`lmstudio_chat`)
- Streaming support (`lmstudio_chat_stream`)
- Comprehensive docstring with setup instructions, troubleshooting, and model recommendations
- `test_lmstudio_provider.py` (227 lines) - Complete test suite with 12 passing tests covering:
- API URL construction
- Server health checks
- Model listing
- Chat completion functionality
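The health check and model listing reduce to probing LM Studio's OpenAI-compatible endpoint. A hedged sketch: only the function names `check_lmstudio_running` and `list_lmstudio_models` come from the file above; the `api_url` helper and exact signatures are invented for illustration.

```python
import json
import urllib.request
import urllib.error

DEFAULT_PORT = 1234  # LM Studio's standard port

def api_url(path, port=DEFAULT_PORT):
    # Build a /v1 endpoint URL, tolerating a leading slash on path.
    return f"http://localhost:{port}/v1/{path.lstrip('/')}"

def check_lmstudio_running(port=DEFAULT_PORT, timeout=2):
    # A live server answers GET /v1/models; a refused connection means down.
    try:
        with urllib.request.urlopen(api_url("models", port), timeout=timeout):
            return True
    except (urllib.error.URLError, OSError):
        return False

def list_lmstudio_models(port=DEFAULT_PORT, timeout=5):
    with urllib.request.urlopen(api_url("models", port), timeout=timeout) as resp:
        data = json.load(resp)
    # OpenAI-compatible shape: {"data": [{"id": "..."}, ...]}
    return [m["id"] for m in data.get("data", [])]
```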
**Modified Files:**
- `docs/quick-start-mac-linux.md` (+34 lines) - Added Option D: LM Studio with setup instructions and troubleshooting
- `docs/quick-start-windows.md` (+34 lines) - Added Option D: LM Studio with PowerShell syntax and troubleshooting
### Key Features
- No API key required (local inference)
- Default port: 1234 (LM Studio's standard)
- OpenAI-compatible API integration
- Consistent with existing provider patterns (Ollama, Atomic Chat)
- All tests passing (12/12)
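Streaming support (`lmstudio_chat_stream`) consumes the standard OpenAI server-sent-events framing. A sketch of the chunk parsing, assuming the usual `data: {...}` / `data: [DONE]` lines; helper names are illustrative, not taken from the provider file:

```python
import json

def parse_sse_line(line):
    # A streamed line looks like: data: {"choices":[{"delta":{"content":"Hi"}}]}
    # The stream ends with:       data: [DONE]
    line = line.strip()
    if not line.startswith("data:"):
        return None  # comments and keep-alives carry no content
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return None
    chunk = json.loads(payload)
    delta = chunk["choices"][0].get("delta", {})
    return delta.get("content")

def collect_stream(lines):
    # Concatenate the content deltas from a streamed response.
    return "".join(t for t in map(parse_sse_line, lines) if t)
```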
### Usage
```bash
export CLAUDE_CODE_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_MODEL=your-model-name
openclaude
```
* Rework the LM Studio PR as a documentation-only PR
* Update LM Studio docs for recent UI changes
Split the setup documentation into a simple beginner path and a separate advanced path. Add OS-specific quick starts for Windows and macOS/Linux so non-technical users can copy and paste the right commands without sorting through Bun and source-build instructions.