# Owlen Ollama Provider
Local LLM integration via Ollama for the Owlen AI agent.
## Overview
This crate lets the Owlen agent use models served by a local Ollama instance. It is ideal for privacy-focused workflows or for development without an internet connection.
## Features
- **Local Execution**: No API keys required for basic local use.
- **Llama 3 / Qwen Support**: Compatible with popular open-source models.
- **Custom Model URLs**: Connect to Ollama instances running on non-standard ports or remote servers.
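Under the hood, Ollama exposes a plain REST API, and the provider talks to it over HTTP. A completion request to Ollama's `/api/generate` endpoint has roughly the following shape (the model name here is an example and must match a model that has been pulled locally):

```json
{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}
```

This body is POSTed to `<base-url>/api/generate`, e.g. `http://localhost:11434/api/generate` for a default local instance; with `"stream": false`, Ollama returns a single JSON response instead of a stream of chunks.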
## Configuration
This provider requires a running Ollama instance. The default connection URL is `http://localhost:11434`.
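A quick way to confirm the instance is reachable is to query Ollama's `/api/tags` endpoint, which lists the models installed on that instance. The `OLLAMA_URL` variable below is just a local convenience for this snippet, not a setting the crate reads; override it when connecting to a remote server or a non-standard port:

```shell
# Base URL of the Ollama instance (defaults to the standard local port 11434)
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"

# /api/tags lists the models available on the instance;
# a JSON response here confirms the server is up and reachable
curl -s "$OLLAMA_URL/api/tags"
```

If this returns a connection error, start Ollama first (e.g. `ollama serve`) and check that the port matches your configuration.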