Integrate core functionality for tools, MCP, and enhanced session management

Adds consent management for tool execution, input validation, sandboxed process execution, and MCP (Model Context Protocol) server integration. Updates session management to support tool use, conversation persistence, and streaming responses.

Major additions:
- Database migrations for conversations and secure storage
- Encryption and credential management infrastructure
- Extensible tool system with code execution and web search (see the sketch after this list)
- Consent management and validation systems
- Sandboxed process execution
- MCP server integration
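
The tool and consent plumbing lives in owlen-core and is not visible in the CLI diff below. As a rough illustration of how a consent-gated tool might fit together, here is a minimal sketch; the `Tool` trait, `ConsentManager`, and every name and signature in it are assumptions made for this example, not the actual owlen-core API:

```rust
use std::collections::HashMap;

/// Hypothetical tool interface: the model invokes a named tool with string arguments.
trait Tool {
    fn name(&self) -> &str;
    fn description(&self) -> &str;
    /// Implementations are expected to validate their input before running.
    fn execute(&self, args: &HashMap<String, String>) -> Result<String, String>;
}

/// Hypothetical consent gate: tracks which tools the user has approved.
struct ConsentManager {
    approved: HashMap<String, bool>,
}

impl ConsentManager {
    fn new() -> Self {
        Self { approved: HashMap::new() }
    }

    fn allow(&mut self, tool_name: &str) {
        self.approved.insert(tool_name.to_string(), true);
    }

    fn is_allowed(&self, tool_name: &str) -> bool {
        self.approved.get(tool_name).copied().unwrap_or(false)
    }
}

/// Runs a tool only if the user has granted consent for it.
fn run_with_consent(
    consent: &ConsentManager,
    tool: &dyn Tool,
    args: &HashMap<String, String>,
) -> Result<String, String> {
    if !consent.is_allowed(tool.name()) {
        return Err(format!("consent denied for tool '{}'", tool.name()));
    }
    tool.execute(args)
}
```

In this sketch, a code-execution or web-search tool would implement `Tool` and only ever run through `run_with_consent`, with sandboxing handled inside the tool's `execute` implementation.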

Infrastructure changes:
- Module registration and workspace dependencies
- ToolCall type and tool-related Message methods (see the sketch after this list)
- Privacy, security, and tool configuration structures
- Database-backed conversation persistence
- Tool call tracking in conversations
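
The `ToolCall` type and the new `Message` helpers are added in owlen-core and do not appear in this diff. A minimal sketch of the shape such types can take follows; the field names, helper names, and string-keyed arguments are assumptions for illustration only:

```rust
use std::collections::HashMap;

/// Hypothetical: a single tool invocation requested by the model.
#[derive(Debug, Clone)]
struct ToolCall {
    id: String,
    name: String,
    arguments: HashMap<String, String>,
}

/// Hypothetical: a chat message that may carry tool calls.
#[derive(Debug, Clone)]
struct Message {
    role: String,
    content: String,
    tool_calls: Vec<ToolCall>,
    /// Set on "tool" messages to link a result back to the request.
    tool_call_id: Option<String>,
}

impl Message {
    /// True if the assistant asked for at least one tool invocation.
    fn has_tool_calls(&self) -> bool {
        !self.tool_calls.is_empty()
    }

    /// Builds the message that carries a tool's output back to the model.
    fn tool_result(call_id: &str, output: &str) -> Self {
        Self {
            role: "tool".to_string(),
            content: output.to_string(),
            tool_calls: Vec::new(),
            tool_call_id: Some(call_id.to_string()),
        }
    }
}
```

Database-backed conversation persistence would then serialize messages of this shape, tool calls included, which is what the "tool call tracking in conversations" item refers to.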

Provider and UI updates:
- Ollama provider updates for tool support and new Role types (see the sketch after this list)
- TUI chat and code app updates for async initialization
- CLI updates for new SessionController API
- Configuration documentation updates
- CHANGELOG updates
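
The "new Role types" in the provider bullet above are defined in owlen-core rather than in this diff. As a sketch of what a role enum extended for tool messages can look like (the exact variants and wire names here are assumptions; chat providers such as Ollama generally expect a `tool` role for tool results):

```rust
/// Hypothetical role enum: the actual owlen-core variants may differ.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Role {
    System,
    User,
    Assistant,
    /// Carries a tool's output back to the model.
    Tool,
}

impl Role {
    /// Wire name in the form chat providers generally expect.
    fn as_str(self) -> &'static str {
        match self {
            Role::System => "system",
            Role::User => "user",
            Role::Assistant => "assistant",
            Role::Tool => "tool",
        }
    }
}
```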

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
Date: 2025-10-06 18:36:42 +02:00
Parent: 9c777c8429
Commit: 235f84fa19
24 changed files with 4734 additions and 1549 deletions


@@ -2,7 +2,7 @@
 use anyhow::Result;
 use clap::{Arg, Command};
-use owlen_core::session::SessionController;
+use owlen_core::{session::SessionController, storage::StorageManager};
 use owlen_ollama::OllamaProvider;
 use owlen_tui::{config, ui, AppState, ChatApp, Event, EventHandler, SessionEvent};
 use std::io;
@@ -38,14 +38,27 @@ async fn main() -> Result<()> {
     }
 
     // Prepare provider from configuration
-    let provider_cfg = config::ensure_ollama_config(&mut config).clone();
+    let provider_name = config.general.default_provider.clone();
+    let provider_cfg = config::ensure_provider_config(&mut config, &provider_name).clone();
+    let provider_type = provider_cfg.provider_type.to_ascii_lowercase();
+    if provider_type != "ollama" && provider_type != "ollama-cloud" {
+        anyhow::bail!(
+            "Unsupported provider type '{}' configured for provider '{}'",
+            provider_cfg.provider_type,
+            provider_name
+        );
+    }
+
     let provider = Arc::new(OllamaProvider::from_config(
         &provider_cfg,
         Some(&config.general),
     )?);
 
-    let controller = SessionController::new(provider, config.clone());
-    let (mut app, mut session_rx) = ChatApp::new(controller);
+    let storage = Arc::new(StorageManager::new().await?);
+
+    // Chat client - code execution tools disabled (only available in code client)
+    let controller = SessionController::new(provider, config.clone(), storage.clone(), false)?;
+    let (mut app, mut session_rx) = ChatApp::new(controller).await?;
     app.initialize_models().await?;
 
     // Event infrastructure
@@ -104,7 +117,14 @@ async fn run_app(
         terminal.draw(|f| ui::render_chat(f, app))?;
 
         // Process any pending LLM requests AFTER UI has been drawn
-        app.process_pending_llm_request().await?;
+        if let Err(e) = app.process_pending_llm_request().await {
+            eprintln!("Error processing LLM request: {}", e);
+        }
+
+        // Process any pending tool executions AFTER UI has been drawn
+        if let Err(e) = app.process_pending_tool_execution().await {
+            eprintln!("Error processing tool execution: {}", e);
+        }
 
         tokio::select! {
             Some(event) = event_rx.recv() => {