From f39c7a75f2edc668c9ab12af356b3518092d7918 Mon Sep 17 00:00:00 2001
From: vikingowl
Date: Fri, 26 Dec 2025 18:29:22 +0100
Subject: [PATCH] docs(app): Add README.md for CLI and UI crates

---
 crates/app/cli/README.md | 29 +++++++++++++++++++++++++++++
 crates/app/ui/README.md  | 18 ++++++++++++++++++
 2 files changed, 47 insertions(+)
 create mode 100644 crates/app/cli/README.md
 create mode 100644 crates/app/ui/README.md

diff --git a/crates/app/cli/README.md b/crates/app/cli/README.md
new file mode 100644
index 0000000..55cea5c
--- /dev/null
+++ b/crates/app/cli/README.md
@@ -0,0 +1,29 @@
+# Owlen CLI
+
+The command-line interface for the Owlen AI agent.
+
+## Features
+- **Interactive Chat:** Communicate with the AI agent directly from your terminal.
+- **Tool Integration:** Built-in support for filesystem operations, bash execution, and more.
+- **Provider Management:** Easily switch between different LLM providers (Ollama, Anthropic, OpenAI).
+- **Session Management:** Persist conversation history and resume previous sessions.
+- **Secure Authentication:** Managed authentication flows for major AI providers.
+
+## Usage
+
+### Direct Invocation
+```bash
+# Start an interactive chat session
+owlen
+
+# Ask a single question
+owlen "How do I list files in Rust?"
+```
+
+### Commands
+- `owlen config`: View or modify agent configuration.
+- `owlen login <provider>`: Authenticate with a specific LLM provider.
+- `owlen session`: Manage chat sessions.
+
+## Configuration
+Owlen uses a global configuration file located at `~/.config/owlen/config.toml`. You can also provide project-specific settings via an `.owlen.toml` file in your project root.
diff --git a/crates/app/ui/README.md b/crates/app/ui/README.md
new file mode 100644
index 0000000..182259b
--- /dev/null
+++ b/crates/app/ui/README.md
@@ -0,0 +1,18 @@
+# Owlen UI
+
+A Terminal User Interface (TUI) for the Owlen AI agent, built with Ratatui.
+
+## Features
+- **Rich Text Rendering:** Markdown support with syntax highlighting for code blocks.
+- **Interactive Components:** Intuitive panels for chat, tool execution, and session status.
+- **Real-time Streaming:** Smooth display of agent output as it is generated.
+- **Task Visualization:** Dedicated view for tracking the agent's progress through a task list.
+
+## Architecture
+The UI is built around an event-driven architecture integrated with the `agent-core` event stream. It uses `ratatui` for terminal rendering and `crossterm` for event handling.
+
+## Components
+- `ChatPanel`: Displays the conversation history.
+- `TaskPanel`: Shows the current implementation plan and task status.
+- `ToolPanel`: Visualizes active tool executions and their output.
+- `ModelPicker`: Allows selecting between available LLM providers and models.
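
To make the CLI README's Configuration section more concrete, here is a minimal sketch of how the global `~/.config/owlen/config.toml` and a project-level `.owlen.toml` could be layered. The `serde`, `toml`, and `dirs` crates, the `Config` struct, and the `provider`/`model` fields are illustrative assumptions; the patch does not specify the config schema or precedence, so project-over-global precedence is assumed here.

```rust
// Sketch of layered configuration loading. Crate choices (`serde`, `toml`,
// `dirs`) and field names are assumptions, not Owlen's actual schema.
use serde::Deserialize;
use std::{fs, path::Path};

#[derive(Debug, Default, Deserialize)]
#[serde(default)]
struct Config {
    provider: Option<String>, // e.g. "ollama", "anthropic", "openai"
    model: Option<String>,
}

// Read one TOML layer, treating a missing or unparsable file as empty.
fn load_layer(path: &Path) -> Config {
    fs::read_to_string(path)
        .ok()
        .and_then(|s| toml::from_str(&s).ok())
        .unwrap_or_default()
}

fn main() {
    // Global file: ~/.config/owlen/config.toml (located via the `dirs` crate).
    let global = dirs::config_dir()
        .map(|dir| load_layer(&dir.join("owlen/config.toml")))
        .unwrap_or_default();
    // Project file: ./.owlen.toml, assumed here to override the global layer.
    let project = load_layer(Path::new(".owlen.toml"));

    let provider = project.provider.or(global.provider);
    let model = project.model.or(global.model);
    println!("provider = {provider:?}, model = {model:?}");
}
```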
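
The UI README describes an event-driven loop built on `ratatui` and `crossterm`. The sketch below shows what such a loop can look like, assuming a recent `ratatui` (0.28+, for `ratatui::init`/`restore`); the crate's actual `ChatPanel`, `TaskPanel`, and `ToolPanel` widgets are not reproduced — a plain paragraph stands in for the chat view, and `q` quits.

```rust
// Sketch of an event-driven TUI loop in the style the UI README describes:
// ratatui draws frames, crossterm supplies input events. Widget names from
// the README are not used; a simple string transcript stands in.
use std::time::Duration;

use crossterm::event::{self, Event, KeyCode};
use ratatui::{
    widgets::{Block, Borders, Paragraph},
    Frame,
};

fn draw(frame: &mut Frame, transcript: &str) {
    let chat = Paragraph::new(transcript)
        .block(Block::default().borders(Borders::ALL).title("Chat"));
    frame.render_widget(chat, frame.area());
}

fn main() -> std::io::Result<()> {
    let mut terminal = ratatui::init();
    let mut transcript = String::from("Welcome to Owlen.\n");

    loop {
        // Render the current state.
        terminal.draw(|frame| draw(frame, &transcript))?;

        // Poll with a timeout so the loop could also pick up streamed
        // agent output from another source between keypresses.
        if event::poll(Duration::from_millis(50))? {
            if let Event::Key(key) = event::read()? {
                match key.code {
                    KeyCode::Char('q') => break,
                    KeyCode::Char(c) => transcript.push(c),
                    _ => {}
                }
            }
        }
    }

    ratatui::restore();
    Ok(())
}
```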