Adds consent management for tool execution, input validation, sandboxed process execution, and MCP server integration. Updates session management to support tool use, conversation persistence, and streaming responses.

Major additions:
- Database migrations for conversations and secure storage
- Encryption and credential management infrastructure
- Extensible tool system with code execution and web search
- Consent management and validation systems
- Sandboxed process execution
- MCP server integration

Infrastructure changes:
- Module registration and workspace dependencies
- `ToolCall` type and tool-related `Message` methods
- Privacy, security, and tool configuration structures
- Database-backed conversation persistence
- Tool call tracking in conversations

Provider and UI updates:
- Ollama provider updates for tool support and new `Role` types
- TUI chat and code app updates for async initialization
- CLI updates for the new `SessionController` API
- Configuration documentation updates
- CHANGELOG updates

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
# Owlen Ollama
This crate implements the `Provider` trait from `owlen-core` for the Ollama backend.
It allows Owlen to communicate with a local Ollama instance, sending requests to and receiving responses from locally run large language models.

You can also target Ollama Cloud by pointing the provider at `https://ollama.com` (or `https://api.ollama.com`) and supplying an API key through your Owlen configuration (or the `OLLAMA_API_KEY` / `OLLAMA_CLOUD_API_KEY` environment variables). When a key is supplied, the client automatically adds the required `Bearer` authorization header, accepts either host without rewriting it, and expands inline environment references such as `$OLLAMA_API_KEY` if you prefer not to check the secret into your config file. The generated configuration now includes both `providers.ollama` and `providers.ollama-cloud` entries; switch between them by updating `general.default_provider`.
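For illustration, a `config.toml` along these lines could switch the default provider to Ollama Cloud; the exact field names (`base_url`, `api_key`) are assumptions and may differ from the keys in your generated configuration:

```toml
[general]
# Select which configured provider Owlen uses by default.
default_provider = "ollama-cloud"

[providers.ollama]
# Local Ollama instance; no API key required.
base_url = "http://localhost:11434"

[providers.ollama-cloud]
# Either https://ollama.com or https://api.ollama.com is accepted as-is.
base_url = "https://ollama.com"
# Inline environment references are expanded, so the secret stays out of the file.
api_key = "$OLLAMA_API_KEY"
```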
## Configuration
To use this provider, you need Ollama installed and running. The default address is `http://localhost:11434`; if your Ollama instance runs elsewhere, you can change this in your `config.toml`.
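As a sketch (again assuming a `base_url` field and using a made-up address), pointing the provider at an Ollama instance on another machine might look like:

```toml
[providers.ollama]
# Hypothetical remote Ollama instance on the local network.
base_url = "http://192.168.1.50:11434"
```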