PRD line 70: "Everything runs locally — no cloud APIs, total data privacy." Yesterday's PR #13 (feb638e) violated this by routing customer-facing inference paths to opencode + ollama_cloud + openrouter. Reverting the hot-path routes only; cloud providers stay configured in providers.toml for explicit dev-tool opt-in.

Reverted:
- modes.toml staffing_inference: kimi-k2.6 → qwen3.5:latest (local Ollama)
- modes.toml doc_drift_check: gemini-3-flash-preview → qwen3.5:latest
- execution_loop overseer: opencode/claude-opus-4-7 → ollama/qwen3.5:latest (was a paid Anthropic call on every overseer escalation; now local and free)

Gateway compiles and restarts cleanly. Lance smoke 10/10. Live provider list unchanged: kimi/ollama_cloud/opencode/openrouter are all still CONFIGURED; they just aren't ROUTED to from the staffing inference path anymore.

This stops the API meter on customer requests. Cloud providers remain opt-in via an explicit provider= caller hint, which the scrum tool, auditor pipeline, and bot/propose use deliberately.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
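A minimal sketch of the reverted routing, assuming modes.toml maps mode names to a `model` string; the exact table and key names are assumptions, only the mode names and model identifiers come from the revert list above:

```toml
# Hypothetical shape of modes.toml after the revert.
# Table/key names are illustrative; model strings match the commit.

[staffing_inference]
# Reverted from kimi-k2.6 (cloud) back to the local Ollama model.
model = "ollama/qwen3.5:latest"

[doc_drift_check]
# Reverted from gemini-3-flash-preview.
model = "ollama/qwen3.5:latest"

[execution_loop.overseer]
# Reverted from opencode/claude-opus-4-7; overseer escalations no
# longer trigger a paid Anthropic call.
model = "ollama/qwen3.5:latest"
```

Cloud providers would still live in providers.toml and be reachable only when a caller passes an explicit provider= hint, as the scrum tool, auditor pipeline, and bot/propose do.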