Plan Execution System:
- Add PlanStep, AccumulatedPlan types for multi-turn tool call accumulation
- Implement AccumulatedPlanStatus for tracking plan lifecycle
- Support selective approval of proposed tool calls before execution

External Tools Integration:
- Add ExternalToolDefinition and ExternalToolTransport to plugins crate
- Extend ToolContext with external_tools registry
- Add external_tool_to_llm_tool conversion for LLM compatibility

JSON-RPC Communication:
- Add jsonrpc crate for JSON-RPC 2.0 protocol support
- Enable stdio-based communication with external tool servers

UI & Engine Updates:
- Add plan_panel.rs component for displaying accumulated plans
- Wire plan mode into engine loop
- Add plan mode integration tests

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
# Owlen CLI
The command-line interface for the Owlen AI agent.
## Features
- Interactive Chat: Communicate with the AI agent directly from your terminal.
- Tool Integration: Built-in support for filesystem operations, bash execution, and more.
- Provider Management: Easily switch between different LLM providers (Ollama, Anthropic, OpenAI).
- Session Management: Persist conversation history and resume previous sessions.
- Secure Authentication: Managed authentication flows for major AI providers.
## Usage

### Direct Invocation

```bash
# Start an interactive chat session
owlen

# Ask a single question
owlen "How do I list files in Rust?"
```
### Commands
- `owlen config`: View or modify agent configuration.
- `owlen login <provider>`: Authenticate with a specific LLM provider.
- `owlen session`: Manage chat sessions.
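A typical first run might chain these commands together. The lowercase provider identifier `anthropic` below is an assumption, so substitute whichever name `owlen login` actually accepts:

```bash
# Authenticate with your chosen provider (Ollama, Anthropic, or OpenAI);
# "anthropic" is an assumed identifier and may differ in practice.
owlen login anthropic

# Review the active configuration before chatting
owlen config

# Inspect or manage saved chat sessions
owlen session
```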
## Configuration
Owlen uses a global configuration file located at `~/.config/owlen/config.toml`. You can also provide project-specific settings via a `.owlen.toml` file in your project root.
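The configuration schema is not documented in this README, so the snippet below is only an illustrative sketch: the key names (`provider`, `model`, and the `[providers.ollama]` table) are assumptions rather than the real format.

```toml
# ~/.config/owlen/config.toml — illustrative sketch; actual keys may differ.
provider = "ollama"   # assumed key: default LLM provider
model = "llama3"      # assumed key: default model for that provider

[providers.ollama]    # assumed table: per-provider settings
host = "http://localhost:11434"
```

A project-local `.owlen.toml` would presumably use the same format to override these values for a single project.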