From f7aac0785b8f42f90cca17e7b60f1c8d3562d304 Mon Sep 17 00:00:00 2001
From: vikingowl
Date: Fri, 26 Dec 2025 18:30:05 +0100
Subject: [PATCH] docs(llm): Add README.md for all LLM crates

---
 crates/llm/anthropic/README.md | 15 +++++++++++++++
 crates/llm/core/README.md      | 18 ++++++++++++++++++
 crates/llm/ollama/README.md    | 14 ++++++++++++++
 crates/llm/openai/README.md    | 14 ++++++++++++++
 4 files changed, 61 insertions(+)
 create mode 100644 crates/llm/anthropic/README.md
 create mode 100644 crates/llm/core/README.md
 create mode 100644 crates/llm/ollama/README.md
 create mode 100644 crates/llm/openai/README.md

diff --git a/crates/llm/anthropic/README.md b/crates/llm/anthropic/README.md
new file mode 100644
index 0000000..16f7dde
--- /dev/null
+++ b/crates/llm/anthropic/README.md
@@ -0,0 +1,15 @@
+# Owlen Anthropic Provider
+
+Anthropic Claude integration for the Owlen AI agent.
+
+## Overview
+This crate implements the `LlmProvider` trait for Anthropic's Claude models. It handles Claude-specific API requirements, including Claude's tool calling format and streaming response structure.
+
+## Features
+- **Claude 3.5 Sonnet Support:** Optimized for the latest high-performance Claude models.
+- **Tool Use:** Native integration with Claude's tool calling capabilities.
+- **Streaming:** Efficient real-time response generation using server-sent events.
+- **Token Counting:** Token estimation using Anthropic-specific logic.
+
+## Configuration
+Requires an `ANTHROPIC_API_KEY` to be set in the environment or configuration.
diff --git a/crates/llm/core/README.md b/crates/llm/core/README.md
new file mode 100644
index 0000000..be9f406
--- /dev/null
+++ b/crates/llm/core/README.md
@@ -0,0 +1,18 @@
+# Owlen LLM Core
+
+The core abstraction layer for Large Language Model (LLM) providers in the Owlen AI agent.
+
+## Overview
+This crate defines the common traits and types used to integrate various AI providers. It enables the agent to be model-agnostic and switch between different backends at runtime.
+
+## Key Components
+- `LlmProvider` trait: the primary interface that all provider implementations (Anthropic, OpenAI, Ollama) must satisfy.
+- `ChatMessage`: unified message structure for conversation history.
+- `StreamChunk`: standardized format for streaming LLM responses.
+- `ToolCall`: abstraction for tool invocation requests from the model.
+- `TokenCounter`: utilities for estimating token usage and managing context windows.
+
+## Supported Providers
+- **Anthropic:** Integration with Claude 3.5 models.
+- **OpenAI:** Integration with GPT-4o models.
+- **Ollama:** Support for local models (e.g., Llama 3, Qwen).
diff --git a/crates/llm/ollama/README.md b/crates/llm/ollama/README.md
new file mode 100644
index 0000000..ca6a385
--- /dev/null
+++ b/crates/llm/ollama/README.md
@@ -0,0 +1,14 @@
+# Owlen Ollama Provider
+
+Local LLM integration via Ollama for the Owlen AI agent.
+
+## Overview
+This crate enables the Owlen agent to use local models running via Ollama. This is ideal for privacy-focused workflows or development without an internet connection.
+
+## Features
+- **Local Execution:** No API keys required for basic local use.
+- **Llama 3 / Qwen Support:** Compatible with popular open-source models.
+- **Custom Endpoints:** Connect to Ollama instances running on non-standard ports or remote servers.
+
+## Configuration
+Requires a running Ollama instance. The default connection URL is `http://localhost:11434`.
diff --git a/crates/llm/openai/README.md b/crates/llm/openai/README.md
new file mode 100644
index 0000000..22ef2d8
--- /dev/null
+++ b/crates/llm/openai/README.md
@@ -0,0 +1,14 @@
+# Owlen OpenAI Provider
+
+OpenAI GPT integration for the Owlen AI agent.
+
+## Overview
+This crate implements the `LlmProvider` trait for OpenAI's models (e.g., GPT-4o). It supports the standard OpenAI chat completion API, including tool calling and streaming.
+
+## Features
+- **GPT-4o Support:** Reliable integration with flagship OpenAI models.
+- **Function Calling:** Full support for OpenAI's tool/function calling mechanism.
+- **Streaming:** Robust real-time response generation.
+
+## Configuration
+Requires an `OPENAI_API_KEY` to be set in the environment or configuration.
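Note for reviewers: the core README describes runtime backend switching through the `LlmProvider` trait. A minimal Rust sketch of that pattern follows, to show the shape the docs are describing. Only the names `LlmProvider` and `ChatMessage` come from the crate docs; every method, field, and the `make_provider` helper are illustrative assumptions, not the crate's actual API (which is async and streaming).

```rust
// Hypothetical sketch of the trait-object pattern behind "switch between
// different backends at runtime". Signatures are assumptions: the real
// owlen-llm-core trait is async/streaming and shaped differently.

struct ChatMessage {
    role: String,
    content: String,
}

trait LlmProvider {
    fn name(&self) -> &str;
    // Placeholder for the real async, streaming completion method.
    fn complete(&self, history: &[ChatMessage]) -> String;
}

struct OllamaProvider {
    base_url: String,
}

struct AnthropicProvider {
    api_key: String,
}

impl LlmProvider for OllamaProvider {
    fn name(&self) -> &str {
        "ollama"
    }
    fn complete(&self, history: &[ChatMessage]) -> String {
        // A real implementation would POST to {base_url}/api/chat.
        let last = history.last().map(|m| m.content.as_str()).unwrap_or("");
        format!("[{} via {}] echo: {}", self.name(), self.base_url, last)
    }
}

impl LlmProvider for AnthropicProvider {
    fn name(&self) -> &str {
        "anthropic"
    }
    fn complete(&self, history: &[ChatMessage]) -> String {
        // A real implementation would authenticate with the API key.
        let _ = &self.api_key;
        let last = history.last().map(|m| m.content.as_str()).unwrap_or("");
        format!("[{}] echo: {}", self.name(), last)
    }
}

// Runtime backend selection: callers hold a Box<dyn LlmProvider> and never
// depend on the concrete provider type.
fn make_provider(kind: &str) -> Box<dyn LlmProvider> {
    match kind {
        "ollama" => Box::new(OllamaProvider {
            base_url: "http://localhost:11434".into(),
        }),
        _ => Box::new(AnthropicProvider {
            api_key: "set-via-ANTHROPIC_API_KEY".into(),
        }),
    }
}

fn main() {
    let history = vec![ChatMessage {
        role: "user".into(),
        content: "hello".into(),
    }];
    let provider = make_provider("ollama");
    println!("{}", provider.complete(&history));
}
```

The design choice the READMEs imply is the usual one: each provider crate compiles independently against the core trait, so adding a backend never touches the agent's call sites.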