OWLEN
Terminal-native assistant for running local language models with a comfortable TUI.
What Is OWLEN?
OWLEN is a Rust-powered, terminal-first interface for interacting with local and cloud language models. It provides a responsive chat workflow that routes through a multi-provider manager handling local Ollama, Ollama Cloud, and future MCP-backed providers, with a focus on developer productivity, vim-style navigation, and seamless session management, all without leaving your terminal.
Alpha Status
This project is currently in alpha and under active development. Core features are functional, but expect occasional bugs and breaking changes. Feedback, bug reports, and contributions are very welcome!
Screenshots
The refreshed chrome introduces a cockpit-style header with live gradient gauges for context and cloud usage, plus glassy panels that keep vim-inspired navigation easy to follow. See more screenshots in the images/ directory.
Features
- Vim-style Navigation: Normal, editing, visual, and command modes.
- Streaming Responses: Real-time token streaming from Ollama.
- Advanced Text Editing: Multi-line input, history, and clipboard support.
- Session Management: Save, load, and manage conversations.
- Code Side Panel: Switch to code mode (`:mode code`) and open files inline with `:open <path>` for LLM-assisted coding.
- Cockpit Header: Gradient context and cloud usage bars with live quota bands and provider fallbacks.
- Theming System: 10 built-in themes and support for custom themes.
- Modular Architecture: Extensible provider system orchestrated by the new `ProviderManager`, ready for additional MCP-backed providers.
- Dual-Source Model Picker: Merge local and cloud catalogues with real-time availability badges powered by the background health worker.
- Non-Blocking UI Loop: Asynchronous generation tasks and provider health checks run off-thread, keeping the TUI responsive even while streaming long replies.
- Guided Setup: `owlen config doctor` upgrades legacy configs and verifies your environment in seconds.
Repository Automation
Owlen now ships with Git-aware automation helpers so you can review code and stage commits without leaving the terminal:
- CLI – `owlen repo commit-template` renders a conventional commit scaffold from the staged diff (`--working-tree` inspects unstaged changes), while `owlen repo review` summarises the current branch or a GitHub pull request. Provide `--owner`, `--repo`, and `--number` to fetch remote diffs; the command picks up credentials from `GITHUB_TOKEN` (override with `--token-env` or `--token`).
- TUI – `:repo template` injects the generated template into the conversation stream, and `:repo review [--base BRANCH] [--head REF]` produces a Markdown review of local changes. The results appear as system messages so you can follow up with an LLM turn or copy them directly into a GitHub comment.
- Automation APIs – Under the hood, `owlen-core::automation::repo` exposes reusable builders (`RepoAutomation`, `CommitTemplate`, `PullRequestReview`) that mirror the Claude Code workflow style. They provide JSON-serialisable checklists, workflow steps, and heuristics that highlight risky changes (e.g., new `unwrap()` calls, unchecked `unsafe` blocks, or absent tests).
Add a personal access token with `repo` scope to unlock GitHub diff fetching. Enterprise installations can point at a custom API host with the `--api-endpoint` flag.
Upgrading to v0.2
- Local + Cloud resiliency: Owlen now distinguishes the on-device daemon from Ollama Cloud and gracefully falls back to local if the hosted key is missing or unauthorized. Cloud requests include `Authorization: Bearer <API_KEY>` and reuse the canonical `https://ollama.com` base URL so you no longer hit 401 loops.
- Context + quota cockpit: The header shows `context used / window (percentage)` and a second gauge for hourly/weekly cloud token usage. Configure soft limits via `providers.ollama_cloud.hourly_quota_tokens` and `weekly_quota_tokens` (see the sketch after this list); Owlen tracks consumption locally even when the provider omits token counters.
- Web search tooling: When cloud is enabled, models can call the spec-compliant `web_search` tool automatically. Toggle availability at runtime with `:web on` / `:web off` if you need a local-only session.
- Docs & config parity: Ship-ready config templates now include per-provider `list_ttl_secs` and `default_context_window` values, plus explicit `OLLAMA_API_KEY` guidance. Run `owlen config doctor` after upgrading from v0.1 to normalize legacy keys and receive deprecation warnings for `OLLAMA_CLOUD_API_KEY` and `OWLEN_OLLAMA_CLOUD_API_KEY`.
- Runtime toggles: Use `:web on` / `:web off` in the TUI or `owlen providers web --enable` / `--disable` from the CLI to expose or hide the `web_search` tool without editing `config.toml`.
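Putting those keys together, the cloud section of `config.toml` might look roughly like this. The quota key paths are quoted from the notes above; placing `list_ttl_secs` and `default_context_window` in the same table, and every numeric value, are assumptions:

```toml
# Sketch only; the values are placeholders, not recommended defaults.
[providers.ollama_cloud]
hourly_quota_tokens = 50000      # soft hourly ceiling for the cockpit gauge
weekly_quota_tokens = 250000     # soft weekly ceiling
list_ttl_secs = 300              # assumed placement: model catalogue cache TTL
default_context_window = 8192    # assumed placement: fallback context size
```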
MCP Naming & Reference Bundles
Owlen enforces spec-compliant tool identifiers: stick to `^[A-Za-z0-9_-]{1,64}$`, avoid dotted names, and keep identifiers short so the host can qualify them when multiple servers are present. Define your tools with underscores or hyphens (for example, `web_search`, `filesystem_read`, `notion_query`) and treat any legacy dotted forms as incompatible.
Modern MCP hosts converge on a common bundle of connectors that cover three broad categories: local operations (filesystem, terminal, git, structured HTTP fetch, browser automation), compute sandboxes (Python, notebook adapters, sequential-thinking planners, test runners), and SaaS integrations (GitHub issues, Notion workspaces, Slack, Stripe, Sentry, Google Drive, Zapier-style automation, design system search). Owlen’s configuration examples mirror that baseline so a fresh install can wire up the same capabilities without additional mapping.
To replicate the reference bundle today:
- Enable the built-in tools that ship with Owlen (`web_search`, filesystem resource APIs, execution sandboxes).
- Add external servers under `[mcp_servers]`, keeping names spec-compliant (e.g., `filesystem`, `terminal`, `git`, `browser`, `http_fetch`, `python`, `notebook`, `sequential_thinking`, `sentry`, `notion`, `slack`, `stripe`, `google_drive`, `memory_bank`, `automation_hub`); a minimal sketch follows this list.
- Qualify tool identifiers in prompts and configs using the `{server}__{tool}` pattern once multiple servers contribute overlapping operations (`filesystem__read`, `browser__request`, `notion__query_database`).
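Assuming a table-per-server layout under `[mcp_servers]` (the `command` and `args` field names below are illustrative, not a confirmed schema), such entries could look like:

```toml
# Hypothetical [mcp_servers] entries; the field names are illustrative only.
# Server names stay within ^[A-Za-z0-9_-]{1,64}$ so the host can qualify
# their tools as {server}__{tool}, e.g. filesystem__read.
[mcp_servers.filesystem]
command = "mcp-server-filesystem"
args = ["--root", "/home/user/projects"]

[mcp_servers.http_fetch]
command = "mcp-server-fetch"
```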
See the updated MCP guide in `docs/` for detailed installation commands, environment variables, and health checks for each connector. The documentation set below walks through configuration and runtime toggles for `web_search` and the rest of the reference bundle.
Security & Privacy
Owlen is designed to keep data local by default while still allowing controlled access to remote tooling.
- Local-first execution: All LLM calls flow through the bundled MCP LLM server, which talks to a local Ollama instance. If the server is unreachable, Owlen stays usable in “offline mode” and surfaces clear recovery instructions.
- Sandboxed tooling: Code execution runs in Docker according to the MCP Code Server settings, and future releases will extend this to other OS-level sandboxes (`sandbox-exec` on macOS, Windows job objects).
- Session storage: Conversations are stored under the platform data directory and can be encrypted at rest. Set `privacy.encrypt_local_data = true` in `config.toml` (see the snippet after this list) to enable AES-GCM storage backed by an Owlen-managed secret key; no passphrase entry required.
- Network access: No telemetry is sent. The only outbound requests occur when you explicitly enable remote tooling (e.g., web search) or configure a cloud LLM provider. Each tool is opt-in via the `privacy` and `tools` configuration sections.
- Config migrations: Every saved `config.toml` carries a schema version and is upgraded automatically; deprecated keys trigger warnings so security-related settings are not silently ignored.
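The encryption toggle quoted above corresponds to this `config.toml` entry (the `[privacy]` table follows directly from the dotted key path):

```toml
# Enable AES-GCM encryption of session data at rest.
[privacy]
encrypt_local_data = true
```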
Getting Started
Prerequisites
- Rust 1.75+ and Cargo.
- A running Ollama instance.
- A terminal that supports 256 colors.
Installation
Pick the option that matches your platform and appetite for source builds:
| Platform | Package / Command | Notes |
|---|---|---|
| Arch Linux | `yay -S owlen-git` | Builds from the latest dev branch via AUR. |
| Other Linux | `cargo install --path crates/owlen-cli --locked --force` | Requires Rust 1.75+ and a running Ollama daemon. |
| macOS | `cargo install --path crates/owlen-cli --locked --force` | macOS 12+ tested. Install Ollama separately (`brew install ollama`). The binary links against the system OpenSSL; ensure Command Line Tools are installed. |
| Windows (experimental) | `cargo install --path crates/owlen-cli --locked --force` | Enable the GNU toolchain (`rustup target add x86_64-pc-windows-gnu`) and install Ollama for Windows preview builds. Some optional tools (e.g., Docker-based code execution) are currently disabled. |
If you prefer containerised builds, use the provided Dockerfile as a base image and copy out `target/release/owlen`.
Run the helper scripts to sanity-check platform coverage:
```sh
# Windows compatibility smoke test (GNU toolchain)
scripts/check-windows.sh

# Reproduce CI packaging locally (choose a target from .woodpecker.yml)
dev/local_build.sh x86_64-unknown-linux-gnu
```
Tip (macOS): On first launch, macOS Gatekeeper may quarantine the binary. Clear the attribute (`xattr -d com.apple.quarantine $(which owlen)`) or build from source locally to avoid notarisation prompts.
Running OWLEN
Make sure Ollama is running, then launch the application:
```sh
owlen
```
If you built from source without installing, you can run it with:
```sh
./target/release/owlen
```
Updating
Owlen does not auto-update. Run `owlen upgrade` at any time to print the recommended manual steps (pull the repository and reinstall with `cargo install --path crates/owlen-cli --force`). Arch Linux users can update via the `owlen-git` AUR package.
Using the TUI
OWLEN uses a modal, vim-inspired interface. Press `F1` (available from any mode) or `?` in Normal mode to view the help screen with all keybindings.
- Normal Mode: Navigate with `h`/`j`/`k`/`l`, `w`/`b`, `gg`/`G`.
- Editing Mode: Enter with `i` or `a`. Send messages with `Enter`.
- Command Mode: Enter with `:`. Access commands like `:quit`, `:w`, `:session save`, `:theme`.
- Quick Exit: Press `Ctrl+C` twice in Normal mode to quit quickly (the first press still cancels active generations).
- Tutorial Command: Type `:tutorial` any time for a quick summary of the most important keybindings.
- MCP Slash Commands: Owlen auto-registers zero-argument MCP tools as slash commands; type `/mcp__github__list_prs` (for example) to pull remote context directly into the chat log.
Keymaps
Two built-in keymaps ship with Owlen:
- `vim` (default) – the existing modal bindings documented above.
- `emacs` – bindings centred around `Alt+X`, `Ctrl+Space`, and `Alt+O` shortcuts with Emacs-style submit (`Ctrl+Enter`).
Switch at runtime with `:keymap vim` or `:keymap emacs`. Persist your choice by setting `ui.keymap_profile = "emacs"` (or `"vim"`) in `config.toml`. If you prefer a fully custom layout, point `ui.keymap_path` at a TOML file using the same format as `crates/owlen-tui/keymap.toml`; the new emacs profile file `crates/owlen-tui/keymap_emacs.toml` is a useful template.
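For example, persisting the Emacs profile could look like this in `config.toml` (the key names are quoted above; the commented path value is illustrative):

```toml
[ui]
keymap_profile = "emacs"                        # or "vim" (the default)
# keymap_path = "~/.config/owlen/keymap.toml"   # hypothetical custom layout
```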
Model discovery commands worth remembering:
- `:models --local` or `:models --cloud` jump directly to the corresponding section in the picker.
- `:cloud setup [--force-cloud-base-url]` stores your cloud API key without clobbering an existing local base URL (unless you opt in with the flag).
- `:limits` prints the locally tracked hourly/weekly token totals for each provider and mirrors the values shown in the chat header.

When a catalogue is unreachable, Owlen now tags the picker with `Local unavailable` / `Cloud unavailable` so you can recover without guessing.
Documentation
For more detailed information, please refer to the following documents:
- CONTRIBUTING.md: Guidelines for contributing to the project.
- CHANGELOG.md: A log of changes for each version.
- docs/architecture.md: An overview of the project's architecture.
- docs/troubleshooting.md: Help with common issues.
- docs/repo-map.md: Snapshot of the workspace layout and key crates.
- docs/provider-implementation.md: Trait-level details for implementing providers.
- docs/adding-providers.md: Step-by-step checklist for wiring a provider into the multi-provider architecture and test suite.
- docs/tui-ux-playbook.md: Design principles, modal ergonomics, and keybinding guidance for the TUI.
- Experimental providers staging area: crates/providers/experimental/README.md records the placeholder crates (OpenAI, Anthropic, Gemini) and their current status.
- docs/platform-support.md: Current OS support matrix and cross-check instructions.
Developer Tasks
- `cargo xtask screenshots` regenerates deterministic ANSI dumps (and, when `chafa` is available, PNG renders) for the documentation gallery. Use `--no-png` to skip the PNG step or `--output <dir>` to redirect the output.
Conversation Compression
Owlen automatically compacts older turns once a chat crosses the configured
token threshold. The behaviour is controlled by the `[chat]` section in
`config.toml` (enabled by default via `chat.auto_compress = true`; see the snippet after the list below).
- Launch the TUI with `--no-auto-compress` to opt out for a single run.
- Inside the app, `:compress now` generates an on-demand summary, while `:compress auto on|off` flips the automatic mode and persists the change.
- Each compression pass emits a system summary that carries metadata about the retained messages, strategy, and estimated token savings.
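The default noted above corresponds to this `config.toml` entry (the threshold key shown alongside it is hypothetical, included only to illustrate where a trigger point would live):

```toml
[chat]
auto_compress = true       # compact older turns automatically
# token_threshold = 6000   # hypothetical: the trigger point mentioned above
```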
Configuration
OWLEN stores its configuration in the standard platform-specific config directory:
| Platform | Location |
|---|---|
| Linux | `~/.config/owlen/config.toml` |
| macOS | `~/Library/Application Support/owlen/config.toml` |
| Windows | `%APPDATA%\owlen\config.toml` |
Use `owlen config init` to scaffold a fresh configuration (pass `--force` to overwrite an existing file), `owlen config path` to print the resolved location, and `owlen config doctor` to migrate legacy layouts automatically.
You can also add custom themes alongside the config directory (e.g., ~/.config/owlen/themes/).
See the themes/README.md for more details on theming.
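Purely as an illustration (none of the keys below are a confirmed schema; `themes/README.md` is authoritative), a custom theme file might be dropped in like this:

```toml
# ~/.config/owlen/themes/midnight.toml: a hypothetical example file.
# Every field name here is invented for illustration; check themes/README.md.
name = "midnight"

[colors]
background = "#101418"
foreground = "#d8dee9"
accent = "#88c0d0"
```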
Testing
Owlen uses standard Rust tooling for verification. Run the full test suite with:
```sh
cargo test
```
Unit tests cover the command palette state machine, agent response parsing, and key MCP abstractions. Formatting and lint checks can be run with `cargo fmt --all` and `cargo clippy` respectively.
Roadmap
Upcoming milestones focus on feature parity with modern code assistants while keeping Owlen local-first:
- Phase 11 – MCP client enhancements: `owlen mcp add/list/remove`, resource references (`@github:issue://123`), and MCP prompt slash commands.
- Phase 12 – Approval & sandboxing: Three-tier approval modes plus platform-specific sandboxes (Docker, `sandbox-exec`, Windows job objects).
- Phase 13 – Project documentation system: Automatic `OWLEN.md` generation, contextual updates, and nested project support.
- Phase 15 – Provider expansion: OpenAI, Anthropic, and other cloud providers layered onto the existing Ollama-first architecture.
See AGENTS.md for the long-form roadmap and design notes.
Contributing
Contributions are highly welcome! Please see our Contributing Guide for details on how to get started, including our code style, commit conventions, and pull request process.
License
This project is licensed under the GNU Affero General Public License v3.0. See the LICENSE file for details. For commercial or proprietary integrations that cannot adopt AGPL, please reach out to the maintainers to discuss alternative licensing arrangements.
