# OWLEN

> Terminal-native assistant for running local language models with a comfortable TUI.

![Status](https://img.shields.io/badge/status-alpha-yellow) ![Version](https://img.shields.io/badge/version-0.1.11-blue) ![Rust](https://img.shields.io/badge/made_with-Rust-ffc832?logo=rust&logoColor=white) ![License](https://img.shields.io/badge/license-AGPL--3.0-blue)

## What Is OWLEN?

OWLEN is a Rust-powered, terminal-first interface for interacting with local and cloud language models. It provides a responsive chat workflow that now routes through a multi-provider manager handling local Ollama, Ollama Cloud, and future MCP-backed providers, with a focus on developer productivity, vim-style navigation, and seamless session management, all without leaving your terminal.

## Alpha Status

This project is currently in **alpha** and under active development. Core features are functional, but expect occasional bugs and breaking changes. Feedback, bug reports, and contributions are very welcome!

## Screenshots

![OWLEN TUI Layout](images/layout.png)

The OWLEN interface features a clean, multi-panel layout with vim-inspired navigation. See more screenshots in the [`images/`](images/) directory.

## Features

- **Vim-style Navigation**: Normal, editing, visual, and command modes.
- **Streaming Responses**: Real-time token streaming from Ollama.
- **Advanced Text Editing**: Multi-line input, history, and clipboard support.
- **Session Management**: Save, load, and manage conversations.
- **Code Side Panel**: Switch to code mode (`:mode code`) and open files inline with `:open <path>` for LLM-assisted coding.
- **Theming System**: 10 built-in themes and support for custom themes.
- **Modular Architecture**: Extensible provider system orchestrated by the new `ProviderManager`, ready for additional MCP-backed providers.
- **Dual-Source Model Picker**: Merge local and cloud catalogues with real-time availability badges powered by the background health worker.
- **Non-Blocking UI Loop**: Asynchronous generation tasks and provider health checks run off-thread, keeping the TUI responsive even while streaming long replies.
- **Guided Setup**: `owlen config doctor` upgrades legacy configs and verifies your environment in seconds.

## Security & Privacy

Owlen is designed to keep data local by default while still allowing controlled access to remote tooling.

- **Local-first execution**: All LLM calls flow through the bundled MCP LLM server, which talks to a local Ollama instance. If the server is unreachable, Owlen stays usable in "offline mode" and surfaces clear recovery instructions.
- **Sandboxed tooling**: Code execution runs in Docker according to the MCP Code Server settings, and future releases will extend this to other OS-level sandboxes (`sandbox-exec` on macOS, Windows job objects).
- **Session storage**: Conversations are stored under the platform data directory and can be encrypted at rest. Set `privacy.encrypt_local_data = true` in `config.toml` to enable AES-GCM storage protected by a user-supplied passphrase (see the sketch after this list).
- **Network access**: No telemetry is sent. The only outbound requests occur when you explicitly enable remote tooling (e.g., web search) or configure a cloud LLM provider. Each tool is opt-in via the `privacy` and `tools` configuration sections.
- **Config migrations**: Every saved `config.toml` carries a schema version and is upgraded automatically; deprecated keys trigger warnings so security-related settings are not silently ignored.
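To make the opt-in model concrete, here is a minimal sketch of what those settings could look like in `config.toml`. Only `privacy.encrypt_local_data` is documented above; the remaining keys are hypothetical placeholders, so treat your generated config and `owlen config doctor` as the source of truth.

```toml
# Hypothetical config.toml excerpt; only privacy.encrypt_local_data is
# documented above, the remaining keys are illustrative placeholders.

[privacy]
# Encrypt saved sessions at rest (AES-GCM, passphrase-protected).
encrypt_local_data = true

[tools]
# Remote tooling stays disabled unless explicitly enabled here.
# web_search = true   # hypothetical example of opting in to a remote tool
```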
## Getting Started

### Prerequisites

- Rust 1.75+ and Cargo.
- A running Ollama instance.
- A terminal that supports 256 colors.

### Installation

Pick the option that matches your platform and appetite for source builds:

| Platform | Package / Command | Notes |
| --- | --- | --- |
| Arch Linux | `yay -S owlen-git` | Builds from the latest `dev` branch via AUR. |
| Other Linux | `cargo install --path crates/owlen-cli --locked --force` | Requires Rust 1.75+ and a running Ollama daemon. |
| macOS | `cargo install --path crates/owlen-cli --locked --force` | macOS 12+ tested. Install Ollama separately (`brew install ollama`). The binary links against the system OpenSSL; ensure Command Line Tools are installed. |
| Windows (experimental) | `cargo install --path crates/owlen-cli --locked --force` | Enable the GNU toolchain (`rustup target add x86_64-pc-windows-gnu`) and install Ollama for Windows preview builds. Some optional tools (e.g., Docker-based code execution) are currently disabled. |

If you prefer containerised builds, use the provided `Dockerfile` as a base image and copy out `target/release/owlen`.

Run the helper scripts to sanity-check platform coverage:

```bash
# Windows compatibility smoke test (GNU toolchain)
scripts/check-windows.sh

# Reproduce CI packaging locally (choose a target from .woodpecker.yml)
dev/local_build.sh x86_64-unknown-linux-gnu
```

> **Tip (macOS):** On first launch, macOS Gatekeeper may quarantine the binary. Clear the attribute (`xattr -d com.apple.quarantine $(which owlen)`) or build from source locally to avoid notarisation prompts.

### Running OWLEN

Make sure Ollama is running, then launch the application:

```bash
owlen
```

If you built from source without installing, you can run it with:

```bash
./target/release/owlen
```

### Updating

Owlen does not auto-update. Run `owlen upgrade` at any time to print the recommended manual steps (pull the repository and reinstall with `cargo install --path crates/owlen-cli --force`). Arch Linux users can update via the `owlen-git` AUR package.

## Using the TUI

OWLEN uses a modal, vim-inspired interface. Press `F1` (available from any mode) or `?` in Normal mode to view the help screen with all keybindings.

- **Normal Mode**: Navigate with `h/j/k/l`, `w/b`, `gg/G`.
- **Editing Mode**: Enter with `i` or `a`. Send messages with `Enter`.
- **Command Mode**: Enter with `:`. Access commands like `:quit`, `:w`, `:session save`, `:theme`.
- **Quick Exit**: Press `Ctrl+C` twice in Normal mode to quit quickly (the first press still cancels active generations).
- **Tutorial Command**: Type `:tutorial` at any time for a quick summary of the most important keybindings.
- **MCP Slash Commands**: Owlen auto-registers zero-argument MCP tools as slash commands: type `/mcp__github__list_prs` (for example) to pull remote context directly into the chat log.

Model discovery commands worth remembering:

- `:models --local` or `:models --cloud` jump directly to the corresponding section in the picker.
- `:cloud setup [--force-cloud-base-url]` stores your cloud API key without clobbering an existing local base URL (unless you opt in with the flag).

When a catalogue is unreachable, Owlen now tags the picker with `Local unavailable` / `Cloud unavailable` so you can recover without guessing.

## Documentation

For more detailed information, please refer to the following documents:

- **[CONTRIBUTING.md](CONTRIBUTING.md)**: Guidelines for contributing to the project.
- **[CHANGELOG.md](CHANGELOG.md)**: A log of changes for each version.
- **[docs/architecture.md](docs/architecture.md)**: An overview of the project's architecture.
- **[docs/troubleshooting.md](docs/troubleshooting.md)**: Help with common issues.
- **[docs/repo-map.md](docs/repo-map.md)**: Snapshot of the workspace layout and key crates.
- **[docs/provider-implementation.md](docs/provider-implementation.md)**: Trait-level details for implementing providers.
- **[docs/adding-providers.md](docs/adding-providers.md)**: Step-by-step checklist for wiring a provider into the multi-provider architecture and test suite.
- **Experimental providers staging area**: [crates/providers/experimental/README.md](crates/providers/experimental/README.md) records the placeholder crates (OpenAI, Anthropic, Gemini) and their current status.
- **[docs/platform-support.md](docs/platform-support.md)**: Current OS support matrix and cross-check instructions.

## Configuration

OWLEN stores its configuration in the standard platform-specific config directory:

| Platform | Location |
|----------|----------|
| Linux | `~/.config/owlen/config.toml` |
| macOS | `~/Library/Application Support/owlen/config.toml` |
| Windows | `%APPDATA%\owlen\config.toml` |

Use `owlen config path` to print the exact location on your machine and `owlen config doctor` to migrate a legacy config automatically.

You can also add custom themes alongside the config directory (e.g., `~/.config/owlen/themes/`). See [themes/README.md](themes/README.md) for more details on theming; a hypothetical theme file is sketched below.
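If you want to experiment before reading the theming docs, the snippet below is a rough, unverified sketch of what a custom theme file might look like; the file name and every key in it are hypothetical placeholders, and [themes/README.md](themes/README.md) documents the actual schema.

```toml
# ~/.config/owlen/themes/midnight.toml -- hypothetical sketch only;
# the key names below are placeholders, not the confirmed theme schema.

name = "midnight"

[colors]
background = "#1a1b26"   # panel background
foreground = "#c0caf5"   # primary text
accent     = "#7aa2f7"   # highlights and borders
```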
## Testing

Owlen uses standard Rust tooling for verification. Run the full test suite with:

```bash
cargo test
```

Unit tests cover the command palette state machine, agent response parsing, and key MCP abstractions. Formatting and lint checks can be run with `cargo fmt --all` and `cargo clippy`, respectively.

## Roadmap

Upcoming milestones focus on feature parity with modern code assistants while keeping Owlen local-first:

1. **Phase 11 – MCP client enhancements**: `owlen mcp add/list/remove`, resource references (`@github:issue://123`), and MCP prompt slash commands.
2. **Phase 12 – Approval & sandboxing**: Three-tier approval modes plus platform-specific sandboxes (Docker, `sandbox-exec`, Windows job objects).
3. **Phase 13 – Project documentation system**: Automatic `OWLEN.md` generation, contextual updates, and nested project support.
4. **Phase 15 – Provider expansion**: OpenAI, Anthropic, and other cloud providers layered onto the existing Ollama-first architecture.

See `AGENTS.md` for the long-form roadmap and design notes.

## Contributing

Contributions are highly welcome! Please see our **[Contributing Guide](CONTRIBUTING.md)** for details on how to get started, including our code style, commit conventions, and pull request process.

## License

This project is licensed under the GNU Affero General Public License v3.0. See the [LICENSE](LICENSE) file for details.

For commercial or proprietary integrations that cannot adopt AGPL, please reach out to the maintainers to discuss alternative licensing arrangements.