gnoma/internal/router
vikingowl 34108075d2 feat: auto-discover local models from ollama + llama.cpp
At startup, polls ollama (/api/tags) and llama.cpp (/v1/models) for
available models. Registers each as an arm in the router alongside
the CLI-specified provider.

Discovered: 7 ollama models + 1 llama.cpp model = 9 total arms.
Router can now select from multiple local models based on task type.
Discovery is non-blocking: failures are logged and the backend is skipped.
2026-04-03 17:53:11 +02:00
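A minimal Go sketch of the discovery step this commit describes: poll ollama's `/api/tags` and llama.cpp's OpenAI-compatible `/v1/models`, collect model names, and log-and-skip any backend that is unreachable. The response shapes are the public ones documented for each server; the base URLs (ollama's default port 11434, llama.cpp's default 8080), function names, and the arm-registration side are assumptions, not the actual `gnoma/internal/router` code.

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
	"time"
)

// parseOllamaTags extracts model names from an ollama /api/tags response,
// which has the shape {"models":[{"name":"..."}, ...]}.
func parseOllamaTags(body []byte) ([]string, error) {
	var resp struct {
		Models []struct {
			Name string `json:"name"`
		} `json:"models"`
	}
	if err := json.Unmarshal(body, &resp); err != nil {
		return nil, err
	}
	names := make([]string, 0, len(resp.Models))
	for _, m := range resp.Models {
		names = append(names, m.Name)
	}
	return names, nil
}

// parseOpenAIModels extracts model ids from an OpenAI-style /v1/models
// response ({"data":[{"id":"..."}, ...]}), which llama.cpp's server exposes.
func parseOpenAIModels(body []byte) ([]string, error) {
	var resp struct {
		Data []struct {
			ID string `json:"id"`
		} `json:"data"`
	}
	if err := json.Unmarshal(body, &resp); err != nil {
		return nil, err
	}
	ids := make([]string, 0, len(resp.Data))
	for _, d := range resp.Data {
		ids = append(ids, d.ID)
	}
	return ids, nil
}

// fetch GETs url with a short timeout so a dead backend cannot stall startup.
func fetch(url string) ([]byte, error) {
	client := &http.Client{Timeout: 2 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	return io.ReadAll(resp.Body)
}

func main() {
	var arms []string
	// Non-blocking discovery: a failed backend is logged and skipped,
	// and the router starts with whatever arms were found.
	if body, err := fetch("http://localhost:11434/api/tags"); err != nil {
		log.Printf("ollama discovery skipped: %v", err)
	} else if names, err := parseOllamaTags(body); err != nil {
		log.Printf("ollama response unparsable: %v", err)
	} else {
		arms = append(arms, names...)
	}
	if body, err := fetch("http://localhost:8080/v1/models"); err != nil {
		log.Printf("llama.cpp discovery skipped: %v", err)
	} else if ids, err := parseOpenAIModels(body); err != nil {
		log.Printf("llama.cpp response unparsable: %v", err)
	} else {
		arms = append(arms, ids...)
	}
	fmt.Printf("registered %d local model arms\n", len(arms))
}
```

With 7 models reported by ollama and 1 by llama.cpp, the loop above would register 8 discovered arms; the ninth arm in the commit's count is the CLI-specified provider registered separately.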