This commit completes Phase 10 of the MCP migration by removing all
direct provider usage from the CLI/TUI and enforcing an MCP-first architecture.
## Changes
### Core Architecture
- **main.rs**: Replaced OllamaProvider with RemoteMcpClient
  - Uses the MCP server configuration from config.toml if available
  - Falls back to auto-discovery of the MCP LLM server binary
- **agent_main.rs**: Unified the provider and MCP client into a single RemoteMcpClient
  - Simplifies initialization to a single Arc::clone (see the sketch after this list)
  - All LLM communication now goes through the MCP protocol
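
In miniature, the resulting initialization pattern (the full agent_main.rs is attached at the end of this message). The config-versus-discovery branch in main.rs is described only in comments, since its exact API is not shown in this diff:

```rust
use std::sync::Arc;

use owlen_core::mcp::remote_client::RemoteMcpClient;

fn init_clients() -> anyhow::Result<()> {
    // main.rs: prefers the MCP server entry from config.toml when present;
    // RemoteMcpClient::new() otherwise auto-discovers the server binary.
    let provider = Arc::new(RemoteMcpClient::new()?);

    // agent_main.rs: the same client doubles as the MCP tool client,
    // so unification is a single Arc::clone.
    let mcp_client = Arc::clone(&provider);

    let _ = (provider, mcp_client); // handed to AgentExecutor in the real code
    Ok(())
}
```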
### Dependencies
- **Cargo.toml**: Removed the owlen-ollama dependency from owlen-cli
  - CLI no longer knows about Ollama implementation details
  - Clean separation: only MCP servers use provider crates internally
### Tests
- **agent_tests.rs**: Updated all tests to use RemoteMcpClient
  - Replaced OllamaProvider::new() with RemoteMcpClient::new()
  - Updated test documentation to reflect MCP requirements
  - All tests compile and run successfully (pattern sketched below)
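
A minimal sketch of the updated test pattern, assuming the `AgentExecutor::new(provider, mcp_client, config)` signature shown in the attached file; the test name and prompt are illustrative, and the test needs a reachable MCP LLM server:

```rust
use std::sync::Arc;

use owlen_cli::agent::{AgentConfig, AgentExecutor};
use owlen_core::mcp::remote_client::RemoteMcpClient;

// Requires a running MCP LLM server, per the updated test documentation.
#[tokio::test]
async fn react_agent_answers_over_mcp() -> anyhow::Result<()> {
    // One client serves as both the Provider and the MCP tool client.
    let provider = Arc::new(RemoteMcpClient::new()?);
    let mcp_client = Arc::clone(&provider);

    let executor = AgentExecutor::new(provider, mcp_client, AgentConfig::default());
    let result = executor.run("What is 2 + 2?".to_string()).await?;

    assert!(result.iterations >= 1);
    assert!(!result.answer.is_empty());
    Ok(())
}
```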
### Examples
- **Removed**: custom_provider.rs, basic_chat.rs (deprecated)
- **Added**: mcp_chat.rs - demonstrates the recommended MCP-based usage
  - Shows how to use RemoteMcpClient for LLM interactions
  - Includes model listing and chat request examples (see the sketch below)
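
A hedged sketch of what mcp_chat.rs demonstrates. Only `RemoteMcpClient::new()` is confirmed by this commit; `list_models()` and the `chat(...)` call shape are assumed names for the Provider methods, and the real example may differ:

```rust
use std::sync::Arc;

use owlen_core::mcp::remote_client::RemoteMcpClient;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Confirmed by this commit: the client is constructed with new().
    let client = Arc::new(RemoteMcpClient::new()?);

    // Assumed method names below; shown only to illustrate the flow.
    for model in client.list_models().await? {
        println!("available model: {model:?}");
    }

    let reply = client
        .chat("Say hello over MCP", "llama3.2:latest")
        .await?;
    println!("reply: {reply:?}");
    Ok(())
}
```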
### Cleanup
- Removed outdated TODO about MCP integration (now complete)
- Updated comments to reflect current MCP architecture
## Architecture
```
CLI/TUI → RemoteMcpClient (impl Provider)
        ↓ MCP Protocol (STDIO/HTTP/WS)
MCP LLM Server → OllamaProvider → Ollama
```
## Benefits
- ✅ Clean separation of concerns
- ✅ CLI is protocol-agnostic (only knows MCP)
- ✅ Easier to add new LLM backends (just implement an MCP server)
- ✅ All tests passing
- ✅ Full workspace builds successfully
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Attached: agent_main.rs (62 lines, 1.9 KiB, Rust)
```rust
//! Simple entry point for the ReAct agentic executor.
//!
//! Usage: `owlen-agent "<prompt>" [--model <model>] [--max-iter <n>]`
//!
//! This binary demonstrates the ReAct agent without the full TUI. It creates
//! a RemoteMcpClient (which implements Provider), runs the AgentExecutor and
//! prints the final answer.

use std::sync::Arc;

use clap::Parser;
use owlen_cli::agent::{AgentConfig, AgentExecutor};
use owlen_core::mcp::remote_client::RemoteMcpClient;

/// Command-line arguments for the agent binary.
#[derive(Parser, Debug)]
#[command(
    name = "owlen-agent",
    author,
    version,
    about = "Run the ReAct agent via MCP"
)]
struct Args {
    /// The initial user query.
    prompt: String,

    /// Model to use (defaults to Ollama default).
    #[arg(long)]
    model: Option<String>,

    /// Maximum ReAct iterations.
    #[arg(long, default_value_t = 10)]
    max_iter: usize,
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let args = Args::parse();

    // Initialise the MCP LLM client: it implements Provider and talks to the
    // MCP LLM server which wraps Ollama. This ensures all communication goes
    // through the MCP architecture (Phase 10 requirement).
    let provider = Arc::new(RemoteMcpClient::new()?);

    // The MCP client also serves as the tool client for resource operations;
    // cloning the Arc shares the single underlying client.
    let mcp_client = Arc::clone(&provider);

    let config = AgentConfig {
        max_iterations: args.max_iter,
        model: args.model.unwrap_or_else(|| "llama3.2:latest".to_string()),
        ..AgentConfig::default()
    };

    let executor = AgentExecutor::new(provider, mcp_client, config);
    match executor.run(args.prompt).await {
        Ok(result) => {
            println!("\n✓ Agent completed in {} iterations", result.iterations);
            println!("\nFinal answer:\n{}", result.answer);
            Ok(())
        }
        Err(e) => Err(anyhow::anyhow!(e)),
    }
}
```