This commit completes Phase 10 of the MCP migration by removing all
direct provider usage from the CLI/TUI and enforcing an MCP-first architecture.
## Changes
### Core Architecture
- **main.rs**: Replaced `OllamaProvider` with `RemoteMcpClient`
  - Uses MCP server configuration from `config.toml` if available
  - Falls back to auto-discovery of the MCP LLM server binary
- **agent_main.rs**: Unified provider and MCP client into a single `RemoteMcpClient`
  - Simplifies initialization with the `Arc::clone` pattern (sketched below)
  - All LLM communication now goes through the MCP protocol
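
A minimal sketch of the unified initialization (names are illustrative, and only `RemoteMcpClient::new()` is taken from the example appended below; the config-driven construction path is not shown):

```rust
use owlen_core::mcp::remote_client::RemoteMcpClient;
use std::sync::Arc;

fn init_shared_client() -> anyhow::Result<Arc<RemoteMcpClient>> {
    // Spawns or connects to the MCP LLM server (the auto-discovery path).
    let client = Arc::new(RemoteMcpClient::new()?);

    // agent_main.rs pattern: one client backs both the Provider role and the
    // MCP-client role, so both share a single server connection.
    let provider = Arc::clone(&client);
    let mcp_client = Arc::clone(&client);
    let _ = (provider, mcp_client); // placeholders for where each role is wired in
    Ok(client)
}
```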
### Dependencies
- **Cargo.toml**: Removed `owlen-ollama` dependency from `owlen-cli`
  - CLI no longer knows about Ollama implementation details
  - Clean separation: only MCP servers use provider crates internally
### Tests
- **agent_tests.rs**: Updated all tests to use `RemoteMcpClient` (see the sketch below)
  - Replaced `OllamaProvider::new()` with `RemoteMcpClient::new()`
  - Updated test documentation to reflect MCP requirements
  - All tests compile and run successfully
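
A hedged sketch of the test-setup change (the test name and assertion are illustrative, not taken from the actual suite):

```rust
use owlen_core::{mcp::remote_client::RemoteMcpClient, Provider};
use std::sync::Arc;

#[tokio::test]
async fn lists_models_via_mcp() -> Result<(), anyhow::Error> {
    // Before: tests constructed OllamaProvider::new(...) directly.
    // After: all LLM access goes through the MCP client.
    let client = Arc::new(RemoteMcpClient::new()?);
    let models = client.list_models().await?;
    assert!(!models.is_empty(), "MCP LLM server should expose at least one model");
    Ok(())
}
```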
### Examples
- **Removed**: `custom_provider.rs`, `basic_chat.rs` (deprecated)
- **Added**: `mcp_chat.rs` - demonstrates recommended MCP-based usage (full source appended below)
  - Shows how to use `RemoteMcpClient` for LLM interactions
  - Includes model listing and chat request examples
### Cleanup
- Removed outdated TODO about MCP integration (now complete)
- Updated comments to reflect current MCP architecture
## Architecture
```
CLI/TUI → RemoteMcpClient (impl Provider)
              ↓ MCP Protocol (STDIO/HTTP/WS)
MCP LLM Server → OllamaProvider → Ollama
```
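
Because `RemoteMcpClient` implements `Provider`, call sites can stay generic over the trait. A sketch, assuming `Provider` exposes the async `list_models` method used in the example below:

```rust
use owlen_core::Provider;

// The CLI never names Ollama: any backend reachable over MCP works here.
async fn first_model<P: Provider>(provider: &P) -> anyhow::Result<Option<String>> {
    let models = provider.list_models().await?;
    Ok(models.first().map(|m| m.name.clone()))
}
```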
## Benefits
- ✅ Clean separation of concerns
- ✅ CLI is provider-agnostic (only speaks MCP)
- ✅ Easier to add new LLM backends (just implement MCP server)
- ✅ All tests passing
- ✅ Full workspace builds successfully
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
### mcp_chat.rs (Rust, 72 lines)
```rust
//! Example demonstrating MCP-based chat interaction.
//!
//! This example shows the recommended way to interact with LLMs via the MCP architecture.
//! It uses `RemoteMcpClient` which communicates with the MCP LLM server.
//!
//! Prerequisites:
//! - Build the MCP LLM server: `cargo build --release -p owlen-mcp-llm-server`
//! - Ensure Ollama is running with a model available

use owlen_core::{
    mcp::remote_client::RemoteMcpClient,
    types::{ChatParameters, ChatRequest, Message, Role},
    Provider,
};
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), anyhow::Error> {
    println!("🦉 Owlen MCP Chat Example\n");

    // Create MCP client - this will spawn/connect to the MCP LLM server
    println!("Connecting to MCP LLM server...");
    let client = Arc::new(RemoteMcpClient::new()?);
    println!("✓ Connected\n");

    // List available models
    println!("Fetching available models...");
    let models = client.list_models().await?;
    println!("Available models:");
    for model in &models {
        println!("  - {} ({})", model.name, model.provider);
    }
    println!();

    // Select first available model or default
    let model_name = models
        .first()
        .map(|m| m.id.clone())
        .unwrap_or_else(|| "llama3.2:latest".to_string());
    println!("Using model: {}\n", model_name);

    // Create a simple chat request
    let user_message = "What is the capital of France? Please be concise.";
    println!("User: {}", user_message);

    let request = ChatRequest {
        model: model_name,
        messages: vec![Message::new(Role::User, user_message.to_string())],
        parameters: ChatParameters {
            temperature: Some(0.7),
            max_tokens: Some(100),
            stream: false,
            extra: std::collections::HashMap::new(),
        },
        tools: None,
    };

    // Send request and get response
    println!("\nAssistant: ");
    let response = client.chat(request).await?;
    println!("{}", response.message.content);

    if let Some(usage) = response.usage {
        println!(
            "\n📊 Tokens: {} prompt + {} completion = {} total",
            usage.prompt_tokens, usage.completion_tokens, usage.total_tokens
        );
    }

    Ok(())
}
```
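
Assuming the file sits in a standard `examples/` directory, it should be runnable with something like `cargo run --example mcp_chat` once the MCP LLM server has been built per the prerequisites above.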