
Owlen CLI

The command-line interface for the Owlen AI agent.

Features

  • Interactive Chat: Communicate with the AI agent directly from your terminal.
  • Tool Integration: Built-in support for filesystem operations, bash execution, and more.
  • Provider Management: Easily switch between different LLM providers (Ollama, Anthropic, OpenAI); see the switching example after this list.
  • Session Management: Persist conversation history and resume previous sessions.
  • Secure Authentication: Managed authentication flows for major AI providers.
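
Inside an interactive session, the active provider and model can also be switched on the fly via slash commands. A minimal sketch (the command names come from the TUI; exact behaviour may vary between releases):

# Switch the active LLM provider (e.g. from Ollama to Anthropic)
/provider

# Switch the model used by the current provider
/model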

Usage

Direct Invocation

# Start an interactive chat session
owlen

# Ask a single question
owlen "How do I list files in Rust?"

Commands

  • owlen config: View or modify agent configuration.
  • owlen login <provider>: Authenticate with a specific LLM provider (see the example below).
  • owlen session: Manage chat sessions.
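
A typical first run might authenticate with a hosted provider and then review the configuration. The provider name below is just one of the supported options; flags and output are omitted:

# Authenticate with a hosted provider
owlen login anthropic

# View the current agent configuration
owlen config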

Configuration

Owlen uses a global configuration file located at ~/.config/owlen/config.toml. You can also provide project-specific settings via a .owlen.toml file in your project root.
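
The exact schema is defined by the crate; the snippet below is only an illustrative sketch, and the keys shown are hypothetical:

# ~/.config/owlen/config.toml (keys are hypothetical; check the crate docs for the real schema)
provider = "ollama"    # one of: ollama, anthropic, openai
model = "llama3"       # default model for the chosen provider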