Three default model lists hardcoded mistral:latest as the fallback
when config.get("model_sets") or config.get("models") returns
nothing. Per
feedback_no_mistral.md, mistral 7B has decoder-level JSON malformation
issues (0/5 fill rate on A/B) and is a liability in any path that
depends on structured output from the model.
Swapping to ollama_cloud::gpt-oss:120b (Phase 20 T3 cloud tier)
keeps the defaults reliable for the meta-pipeline orchestrator
(line 9959), the fallback model list for empty Ollama (10084), and
the worker pool default (11835). All three are DEFAULTS: any caller
passing an explicit config.model_sets / config.models is unaffected.
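The fallback pattern being changed can be sketched as follows. This is a
minimal illustration, not the actual code: the function name
resolve_models and the exact config keys are assumptions based on the
description above.

```python
# Hypothetical sketch of the defaults pattern this PR touches.
# Only the hardcoded default changes; explicit config always wins.
DEFAULT_MODELS = ["ollama_cloud::gpt-oss:120b"]  # previously ["mistral:latest"]

def resolve_models(config: dict) -> list[str]:
    # Callers that pass model_sets or models are unaffected; the
    # default only applies when both keys are absent or empty.
    return config.get("model_sets") or config.get("models") or DEFAULT_MODELS
```

With an empty config this now yields the cloud-tier model; with an
explicit models list, behavior is identical to before the change.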
Routing works because query_model's "::" provider prefix already
resolves ollama_cloud via commit fa6ccff. Activation requires
OLLAMA_CLOUD_API_KEY or a key saved via the Admin UI; this PR does
not change credential behavior, only the default model list.
Surfaced by the lakehouse scrum-master pipeline run on 2026-04-24;
findings confirmed by grep verification against the live code.