# Owlen LLM Core

The core abstraction layer for Large Language Model (LLM) providers in the Owlen AI agent.

## Overview

This crate defines the common traits and types used to integrate various AI providers. It enables the agent to be model-agnostic and switch between different backends at runtime.
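
For a feel of what this looks like at a call site, here is a minimal, hedged sketch of agent-side code that depends only on the `LlmProvider` trait (the trait and the types it uses are sketched under Key Components below). The method and field names are assumptions, not the crate's actual API.

```rust
use futures::StreamExt;

// Sketch only: `LlmProvider`, `ChatMessage`, and `StreamChunk` are assumed to
// look like the definitions sketched under Key Components. Because this
// function only sees the trait, any backend can be swapped in without
// touching it.
async fn complete(llm: &dyn LlmProvider, history: &[ChatMessage]) -> anyhow::Result<String> {
    let mut stream = llm.chat_stream(history).await?;
    let mut text = String::new();
    while let Some(chunk) = stream.next().await {
        text.push_str(&chunk.delta);
    }
    Ok(text)
}
```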
## Key Components

- `LlmProvider` trait: The primary interface that all provider implementations (Anthropic, OpenAI, Ollama) must satisfy; a minimal sketch follows this list.
- `ChatMessage`: Unified message structure for conversation history.
- `StreamChunk`: Standardized format for streaming LLM responses.
- `ToolCall`: Abstraction for tool invocation requests from the model.
- `TokenCounter`: Utilities for estimating token usage and managing context windows.
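
To make the list above concrete, here is a minimal sketch of how the trait and its companion types could fit together. All field, method, and signature details below are illustrative assumptions, not the crate's actual definitions; consult the source for the real API.

```rust
use std::pin::Pin;

use futures::Stream;

/// One turn of conversation history, shared by every backend.
pub struct ChatMessage {
    pub role: String, // e.g. "system", "user", "assistant", "tool"
    pub content: String,
}

/// A tool invocation requested by the model.
pub struct ToolCall {
    pub name: String,
    pub arguments: serde_json::Value,
}

/// One increment of a streamed response.
pub struct StreamChunk {
    pub delta: String,
    pub tool_call: Option<ToolCall>,
    pub done: bool,
}

/// The interface every backend (Anthropic, OpenAI, Ollama) implements.
#[async_trait::async_trait]
pub trait LlmProvider: Send + Sync {
    /// Stream a completion for the given conversation history.
    async fn chat_stream(
        &self,
        messages: &[ChatMessage],
    ) -> anyhow::Result<Pin<Box<dyn Stream<Item = StreamChunk> + Send>>>;
}

/// Rough token estimation for context-window management.
pub struct TokenCounter;

impl TokenCounter {
    /// Crude heuristic: roughly four characters per token.
    pub fn estimate(text: &str) -> usize {
        (text.chars().count() + 3) / 4
    }
}
```

Holding providers as trait objects (`Box<dyn LlmProvider>`) is what lets the agent treat every backend uniformly and swap them at runtime.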
## Supported Providers

- **Anthropic:** Integration with Claude 3.5 models.
- **OpenAI:** Integration with GPT-4o models.
- **Ollama:** Support for local models (e.g., Llama 3, Qwen).
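
A hedged sketch of runtime backend selection, building on the trait sketched under Key Components. The provider structs and constructors (`AnthropicProvider::new`, `OpenAiProvider::new`, `OllamaProvider::new`) are placeholder names used for illustration only, not the real provider APIs.

```rust
/// Map a configuration value to a concrete backend behind the common trait.
/// All constructor names here are hypothetical; see the provider crates for
/// the actual ones.
fn provider_from_config(backend: &str, api_key: String) -> anyhow::Result<Box<dyn LlmProvider>> {
    Ok(match backend {
        "anthropic" => Box::new(AnthropicProvider::new(api_key)),
        "openai" => Box::new(OpenAiProvider::new(api_key)),
        "ollama" => Box::new(OllamaProvider::new("http://localhost:11434")),
        other => anyhow::bail!("unknown LLM backend: {other}"),
    })
}
```

Because the rest of the agent only sees `Box<dyn LlmProvider>`, changing the `backend` value is all it takes to move between hosted and local models.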