feat(phase10): complete MCP-only architecture migration

This commit completes Phase 10 of the MCP migration by removing all
direct provider usage from the CLI/TUI and enforcing the MCP-first architecture.

## Changes

### Core Architecture
- **main.rs**: Replaced OllamaProvider with RemoteMcpClient
  - Uses the MCP server configuration from config.toml when available
  - Falls back to auto-discovery of the MCP LLM server binary (see the sketch after this list)
- **agent_main.rs**: Unified the provider and MCP client into a single RemoteMcpClient
  - Simplifies initialization via the Arc::clone pattern
  - All LLM communication now goes through the MCP protocol
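For illustration, the main.rs fallback might look roughly like the sketch below. Only `RemoteMcpClient::new()` is confirmed by this commit (it appears in the diff at the end); the config accessor and the connect-style constructor are hypothetical stand-ins for whatever owlen-cli actually exposes.

```rust
use std::sync::Arc;

use owlen_core::mcp::remote_client::RemoteMcpClient;

/// Sketch of the provider setup in main.rs. Only `RemoteMcpClient::new()` is
/// taken from this commit; the commented-out config path uses hypothetical
/// names (`Config::load`, `mcp_server`, `RemoteMcpClient::connect`).
fn build_provider() -> anyhow::Result<Arc<RemoteMcpClient>> {
    // If config.toml names an MCP LLM server, connect to it:
    // let client = match Config::load()?.mcp_server {
    //     Some(server) => RemoteMcpClient::connect(&server)?,
    //     None => RemoteMcpClient::new()?, // auto-discover the server binary
    // };

    // Confirmed path: auto-discovery of the MCP LLM server binary.
    let client = RemoteMcpClient::new()?;
    Ok(Arc::new(client))
}
```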

### Dependencies
- **Cargo.toml**: Removed owlen-ollama dependency from owlen-cli
  - CLI no longer knows about Ollama implementation details
  - Clean separation: only MCP servers use provider crates internally

### Tests
- **agent_tests.rs**: Updated all tests to use RemoteMcpClient
  - Replaced OllamaProvider::new() with RemoteMcpClient::new() (a representative change is sketched after this list)
  - Updated test documentation to reflect MCP requirements
  - All tests compile and run successfully
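A representative before/after, assuming the agent tests are tokio-based async tests that construct the provider directly; the test body is illustrative only, and the `#[ignore]` reflects the new requirement for a reachable MCP LLM server.

```rust
use std::sync::Arc;

use owlen_core::mcp::remote_client::RemoteMcpClient;

// Sketch only: the real tests live in agent_tests.rs and exercise the agent;
// the exact executor API is not shown in this commit.
#[tokio::test]
#[ignore = "requires a running MCP LLM server (which wraps Ollama)"]
async fn agent_provider_goes_through_mcp() -> anyhow::Result<()> {
    // Before: let provider = Arc::new(OllamaProvider::new("http://localhost:11434")?);
    let provider = Arc::new(RemoteMcpClient::new()?);

    // The same client doubles as the tool client, mirroring agent_main.rs.
    let mcp_client = Arc::clone(&provider);

    // ... build an AgentExecutor from `provider` and `mcp_client`, run a short
    // prompt, and assert on the result.
    let _ = mcp_client;
    Ok(())
}
```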

### Examples
- **Removed**: custom_provider.rs, basic_chat.rs (deprecated)
- **Added**: mcp_chat.rs - demonstrates the recommended MCP-based usage
  - Shows how to use RemoteMcpClient for LLM interactions
  - Includes model listing and chat request examples (rough shape sketched after this list)
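The new example is not reproduced in this view; its rough shape is sketched below, assuming a tokio async main like the agent binary's. `RemoteMcpClient::new()` matches the diff at the end, while the model-listing and chat calls are hypothetical placeholders for whatever the Provider trait in owlen-core actually exposes.

```rust
use std::sync::Arc;

use owlen_core::mcp::remote_client::RemoteMcpClient;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Confirmed by this commit: the client auto-discovers and talks to the
    // MCP LLM server, which wraps Ollama behind the MCP protocol.
    let client = Arc::new(RemoteMcpClient::new()?);

    // Hypothetical Provider-trait calls; the real method names and request
    // types are defined in owlen-core and are not shown here.
    // let models = client.list_models().await?;
    // println!("available models: {models:?}");
    //
    // let reply = client.chat("llama3", "Hello over MCP!").await?;
    // println!("{reply}");

    let _ = client;
    Ok(())
}
```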

### Cleanup
- Removed outdated TODO about MCP integration (now complete)
- Updated comments to reflect current MCP architecture

## Architecture

```
CLI/TUI → RemoteMcpClient (impl Provider)
          ↓ MCP Protocol (STDIO/HTTP/WS)
          MCP LLM Server → OllamaProvider → Ollama
```
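The practical upshot of the diagram is that the binaries only ever program against the Provider trait. A minimal, self-contained sketch of that dependency direction follows; the trait body and type names below are invented for illustration, since the real Provider trait lives in owlen-core.

```rust
use std::sync::Arc;

// Stand-in for owlen-core's Provider trait; the single method here is invented.
trait Provider {
    fn backend_name(&self) -> String;
}

// In the real workspace this role is played by RemoteMcpClient, which speaks
// MCP to whatever LLM server is configured.
struct FakeMcpBackend;

impl Provider for FakeMcpBackend {
    fn backend_name(&self) -> String {
        "mcp-llm-server".to_string()
    }
}

// The CLI/TUI layer sees only `dyn Provider`; adding a new LLM backend means
// shipping another MCP server, not touching this code.
fn run_cli(provider: Arc<dyn Provider>) {
    println!("chatting via {}", provider.backend_name());
}

fn main() {
    run_cli(Arc::new(FakeMcpBackend));
}
```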

## Benefits
- Clean separation of concerns
- CLI is protocol-agnostic (only knows MCP)
- Easier to add new LLM backends (just implement an MCP server)
- All tests passing
- Full workspace builds successfully

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
## Diff: agent_main.rs

```diff
@@ -11,11 +11,15 @@ use std::sync::Arc;
 use clap::Parser;
 use owlen_cli::agent::{AgentConfig, AgentExecutor};
 use owlen_core::mcp::remote_client::RemoteMcpClient;
-use owlen_ollama::OllamaProvider;
 
 /// Commandline arguments for the agent binary.
 #[derive(Parser, Debug)]
-#[command(name = "owlen-agent", author, version, about = "Run the ReAct agent")]
+#[command(
+    name = "owlen-agent",
+    author,
+    version,
+    about = "Run the ReAct agent via MCP"
+)]
 struct Args {
     /// The initial user query.
     prompt: String,
@@ -31,11 +35,13 @@ struct Args {
 async fn main() -> anyhow::Result<()> {
     let args = Args::parse();
 
-    // Initialise the LLM provider (Ollama) - uses default local URL.
-    let provider = Arc::new(OllamaProvider::new("http://localhost:11434")?);
-    // Initialise the MCP client (remote LLM server) - this client also knows how
-    // to call the builtin resource tools.
-    let mcp_client = Arc::new(RemoteMcpClient::new()?);
+    // Initialise the MCP LLM client - it implements Provider and talks to the
+    // MCP LLM server which wraps Ollama. This ensures all communication goes
+    // through the MCP architecture (Phase 10 requirement).
+    let provider = Arc::new(RemoteMcpClient::new()?);
+
+    // The MCP client also serves as the tool client for resource operations
+    let mcp_client = Arc::clone(&provider) as Arc<RemoteMcpClient>;
 
     let config = AgentConfig {
         max_iterations: args.max_iter,
```