# OWLEN

> Terminal-native assistant for running local language models with a comfortable TUI.
## What Is OWLEN?

OWLEN is a Rust-powered, terminal-first interface for interacting with local and cloud
language models. It provides a responsive chat workflow that routes through a
multi-provider manager handling local Ollama, Ollama Cloud, and future MCP-backed
providers, with a focus on developer productivity, vim-style navigation, and seamless
session management, all without leaving your terminal.
## Alpha Status

This project is currently in **alpha** and under active development. Core features are functional, but expect occasional bugs and breaking changes. Feedback, bug reports, and contributions are very welcome!
## Screenshots

The refreshed chrome introduces a cockpit-style header with live gradient gauges for context and cloud usage, plus glassy panels that keep vim-inspired navigation easy to follow. See more screenshots in the [`images/`](images/) directory.
## Features

- **Vim-style Navigation**: Normal, editing, visual, and command modes.
- **Streaming Responses**: Real-time token streaming from Ollama.
- **Advanced Text Editing**: Multi-line input, history, and clipboard support.
- **Session Management**: Save, load, and manage conversations.
- **Code Side Panel**: Switch to code mode (`:mode code`) and open files inline with `:open <path>` for LLM-assisted coding.
- **Cockpit Header**: Gradient context and cloud usage bars with live quota bands and provider fallbacks.
- **Theming System**: 10 built-in themes and support for custom themes.
- **Modular Architecture**: Extensible provider system orchestrated by the new `ProviderManager`, ready for additional MCP-backed providers.
- **Dual-Source Model Picker**: Merge local and cloud catalogues with real-time availability badges powered by the background health worker.
- **Non-Blocking UI Loop**: Asynchronous generation tasks and provider health checks run off-thread, keeping the TUI responsive even while streaming long replies.
- **Guided Setup**: `owlen config doctor` upgrades legacy configs and verifies your environment in seconds.
## Upgrading to v0.2

- **Local + Cloud resiliency**: Owlen now distinguishes the on-device daemon from Ollama Cloud and gracefully falls back to local if the hosted key is missing or unauthorized. Cloud requests include `Authorization: Bearer <API_KEY>` and reuse the canonical `https://ollama.com` base URL so you no longer hit 401 loops.
- **Context + quota cockpit**: The header shows `context used / window (percentage)` and a second gauge for hourly/weekly cloud token usage. Configure soft limits via `providers.ollama_cloud.hourly_quota_tokens` and `weekly_quota_tokens`; Owlen tracks consumption locally even when the provider omits token counters (see the config sketch after this list).
- **Web search tooling**: When cloud is enabled, models can call the spec-compliant `web_search` tool automatically. Toggle availability at runtime with `:web on` / `:web off` if you need a local-only session.
- **Docs & config parity**: Ship-ready config templates now include per-provider `list_ttl_secs` and `default_context_window` values, plus explicit `OLLAMA_API_KEY` guidance. Run `owlen config doctor` after upgrading from v0.1 to normalize legacy keys and receive deprecation warnings for `OLLAMA_CLOUD_API_KEY` and `OWLEN_OLLAMA_CLOUD_API_KEY`.
- **Runtime toggles**: Use `:web on` / `:web off` in the TUI or `owlen providers web --enable/--disable` from the CLI to expose or hide the `web_search` tool without editing `config.toml`.
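For orientation, here is a minimal sketch of how those keys might sit in `config.toml`. The quota and TTL numbers are placeholders, and the `[providers.ollama_cloud]` table name plus the `base_url` key are inferred from the dotted paths above, so treat the shipped config template and the `owlen config doctor` output as authoritative:

```toml
# Sketch only -- placeholder values; confirm key names against the shipped template.
[providers.ollama_cloud]
base_url = "https://ollama.com"   # canonical cloud endpoint (inferred key name)
hourly_quota_tokens = 100_000     # soft hourly limit tracked locally (placeholder)
weekly_quota_tokens = 500_000     # soft weekly limit tracked locally (placeholder)
list_ttl_secs = 300               # per-provider model catalogue cache TTL (placeholder)
default_context_window = 8192     # fallback context window (placeholder)
```

The API key itself is supplied via `OLLAMA_API_KEY` or the `:cloud setup` command rather than written into the file.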
## MCP Naming & Reference Bundles

Owlen enforces spec-compliant tool identifiers: stick to `^[A-Za-z0-9_-]{1,64}$`, avoid dotted names, and keep identifiers short so the host can qualify them when multiple servers are present. Define your tools with underscores or hyphens (for example, `web_search`, `filesystem_read`, `notion_query`) and treat any legacy dotted forms as incompatible.

Modern MCP hosts converge on a common bundle of connectors that cover three broad categories: local operations (filesystem, terminal, git, structured HTTP fetch, browser automation), compute sandboxes (Python, notebook adapters, sequential-thinking planners, test runners), and SaaS integrations (GitHub issues, Notion workspaces, Slack, Stripe, Sentry, Google Drive, Zapier-style automation, design system search). Owlen’s configuration examples mirror that baseline so a fresh install can wire up the same capabilities without additional mapping.

To replicate the reference bundle today:

1. Enable the built-in tools that ship with Owlen (`web_search`, filesystem resource APIs, execution sandboxes).
2. Add external servers under `[mcp_servers]`, keeping names spec-compliant (e.g., `filesystem`, `terminal`, `git`, `browser`, `http_fetch`, `python`, `notebook`, `sequential_thinking`, `sentry`, `notion`, `slack`, `stripe`, `google_drive`, `memory_bank`, `automation_hub`); a config sketch follows this list.
3. Qualify tool identifiers in prompts and configs using the `{server}__{tool}` pattern once multiple servers contribute overlapping operations (`filesystem__read`, `browser__request`, `notion__query_database`).
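As a rough illustration of step 2, entries under `[mcp_servers]` might look like the sketch below. The `command` and `args` fields are assumptions made for illustration only, not Owlen's confirmed schema:

```toml
# Illustrative sketch -- server names follow ^[A-Za-z0-9_-]{1,64}$; the
# command/args fields are hypothetical placeholders.
[mcp_servers.filesystem]
command = "mcp-server-filesystem"          # hypothetical launcher binary
args = ["--root", "/path/to/workspace"]

[mcp_servers.git]
command = "mcp-server-git"                 # hypothetical launcher binary
```

With both servers registered, overlapping operations are addressed with the qualified `{server}__{tool}` form from step 3, e.g. `filesystem__read`.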
See the updated MCP guide in `docs/` for detailed installation commands, environment variables, and health checks for each connector. The documentation set below walks through configuration and runtime toggles for `web_search` and the rest of the reference bundle.
## Security & Privacy

Owlen is designed to keep data local by default while still allowing controlled access to remote tooling.

- **Local-first execution**: All LLM calls flow through the bundled MCP LLM server which talks to a local Ollama instance. If the server is unreachable, Owlen stays usable in “offline mode” and surfaces clear recovery instructions.
- **Sandboxed tooling**: Code execution runs in Docker according to the MCP Code Server settings, and future releases will extend this to other OS-level sandboxes (`sandbox-exec` on macOS, Windows job objects).
- **Session storage**: Conversations are stored under the platform data directory and can be encrypted at rest. Set `privacy.encrypt_local_data = true` in `config.toml` to enable AES-GCM storage protected by a user-supplied passphrase (see the sketch after this list).
- **Network access**: No telemetry is sent. The only outbound requests occur when you explicitly enable remote tooling (e.g., web search) or configure a cloud LLM provider. Each tool is opt-in via `privacy` and `tools` configuration sections.
- **Config migrations**: Every saved `config.toml` carries a schema version and is upgraded automatically; deprecated keys trigger warnings so security-related settings are not silently ignored.
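A minimal sketch of the opt-in pattern in `config.toml`: `privacy.encrypt_local_data` is the key named above, while the `[tools]` entry is a hypothetical example of how a remote tool stays disabled until you opt in.

```toml
[privacy]
encrypt_local_data = true   # AES-GCM at rest, unlocked by a user-supplied passphrase

# Hypothetical illustration of the opt-in pattern for remote tooling; check the
# shipped template for the real key names in the `tools` section.
[tools]
web_search = false          # remote web search stays off until explicitly enabled
```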
## Getting Started

### Prerequisites

- Rust 1.75+ and Cargo.
- A running Ollama instance.
- A terminal that supports 256 colors.

### Installation

Pick the option that matches your platform and appetite for source builds:

| Platform | Package / Command | Notes |
| --- | --- | --- |
| Arch Linux | `yay -S owlen-git` | Builds from the latest `dev` branch via AUR. |
| Other Linux | `cargo install --path crates/owlen-cli --locked --force` | Requires Rust 1.75+ and a running Ollama daemon. |
| macOS | `cargo install --path crates/owlen-cli --locked --force` | macOS 12+ tested. Install Ollama separately (`brew install ollama`). The binary links against the system OpenSSL; ensure Command Line Tools are installed. |
| Windows (experimental) | `cargo install --path crates/owlen-cli --locked --force` | Enable the GNU toolchain (`rustup target add x86_64-pc-windows-gnu`) and install Ollama for Windows preview builds. Some optional tools (e.g., Docker-based code execution) are currently disabled. |

If you prefer containerised builds, use the provided `Dockerfile` as a base image and copy out `target/release/owlen`.

Run the helper scripts to sanity-check platform coverage:

```bash
# Windows compatibility smoke test (GNU toolchain)
scripts/check-windows.sh

# Reproduce CI packaging locally (choose a target from .woodpecker.yml)
dev/local_build.sh x86_64-unknown-linux-gnu
```

> **Tip (macOS):** On first launch, macOS Gatekeeper may quarantine the binary. Clear the attribute (`xattr -d com.apple.quarantine $(which owlen)`) or build from source locally to avoid notarisation prompts.
### Running OWLEN

Make sure Ollama is running, then launch the application:

```bash
owlen
```

If you built from source without installing, you can run it with:

```bash
./target/release/owlen
```

### Updating

Owlen does not auto-update. Run `owlen upgrade` at any time to print the recommended manual steps (pull the repository and reinstall with `cargo install --path crates/owlen-cli --force`). Arch Linux users can update via the `owlen-git` AUR package.
## Using the TUI

OWLEN uses a modal, vim-inspired interface. Press `F1` (available from any mode) or `?` in Normal mode to view the help screen with all keybindings.

- **Normal Mode**: Navigate with `h/j/k/l`, `w/b`, `gg/G`.
- **Editing Mode**: Enter with `i` or `a`. Send messages with `Enter`.
- **Command Mode**: Enter with `:`. Access commands like `:quit`, `:w`, `:session save`, `:theme`.
- **Quick Exit**: Press `Ctrl+C` twice in Normal mode to quit quickly (first press still cancels active generations).
- **Tutorial Command**: Type `:tutorial` any time for a quick summary of the most important keybindings.
- **MCP Slash Commands**: Owlen auto-registers zero-argument MCP tools as slash commands—type `/mcp__github__list_prs` (for example) to pull remote context directly into the chat log.

### Keymaps

Two built-in keymaps ship with Owlen:

- `vim` (default) – the existing modal bindings documented above.
- `emacs` – bindings centred around `Alt+X`, `Ctrl+Space`, and `Alt+O` shortcuts with Emacs-style submit (`Ctrl+Enter`).

Switch at runtime with `:keymap vim` or `:keymap emacs`. Persist your choice by setting `ui.keymap_profile = "emacs"` (or `"vim"`) in `config.toml`. If you prefer a fully custom layout, point `ui.keymap_path` at a TOML file using the same format as [`crates/owlen-tui/keymap.toml`](crates/owlen-tui/keymap.toml); the new emacs profile file [`crates/owlen-tui/keymap_emacs.toml`](crates/owlen-tui/keymap_emacs.toml) is a useful template.
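Persisted in `config.toml`, that choice looks roughly like the sketch below; the custom keymap path is only an example location:

```toml
[ui]
keymap_profile = "emacs"                     # or "vim" (the default)
# Optional: a fully custom layout in the same format as crates/owlen-tui/keymap.toml.
# The path below is an illustrative example, not a required location.
keymap_path = "~/.config/owlen/keymap.toml"
```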
Model discovery commands worth remembering:

- `:models --local` or `:models --cloud` jump directly to the corresponding section in the picker.
- `:cloud setup [--force-cloud-base-url]` stores your cloud API key without clobbering an existing local base URL (unless you opt in with the flag).
- `:limits` prints the locally tracked hourly/weekly token totals for each provider and mirrors the values shown in the chat header.

When a catalogue is unreachable, Owlen now tags the picker with `Local unavailable` / `Cloud unavailable` so you can recover without guessing.
## Documentation

For more detailed information, please refer to the following documents:

- **[CONTRIBUTING.md](CONTRIBUTING.md)**: Guidelines for contributing to the project.
- **[CHANGELOG.md](CHANGELOG.md)**: A log of changes for each version.
- **[docs/architecture.md](docs/architecture.md)**: An overview of the project's architecture.
- **[docs/troubleshooting.md](docs/troubleshooting.md)**: Help with common issues.
- **[docs/repo-map.md](docs/repo-map.md)**: Snapshot of the workspace layout and key crates.
- **[docs/provider-implementation.md](docs/provider-implementation.md)**: Trait-level details for implementing providers.
- **[docs/adding-providers.md](docs/adding-providers.md)**: Step-by-step checklist for wiring a provider into the multi-provider architecture and test suite.
- **[docs/tui-ux-playbook.md](docs/tui-ux-playbook.md)**: Design principles, modal ergonomics, and keybinding guidance for the TUI.
- **Experimental providers staging area**: [crates/providers/experimental/README.md](crates/providers/experimental/README.md) records the placeholder crates (OpenAI, Anthropic, Gemini) and their current status.
- **[docs/platform-support.md](docs/platform-support.md)**: Current OS support matrix and cross-check instructions.
## Developer Tasks

- `cargo xtask screenshots` regenerates deterministic ANSI dumps (and, when `chafa` is available, PNG renders) for the documentation gallery. Use `--no-png` to skip the PNG step or `--output <dir>` to redirect the output.
## Configuration

OWLEN stores its configuration in the standard platform-specific config directory:

| Platform | Location |
|----------|----------|
| Linux | `~/.config/owlen/config.toml` |
| macOS | `~/Library/Application Support/owlen/config.toml` |
| Windows | `%APPDATA%\owlen\config.toml` |

Use `owlen config init` to scaffold a fresh configuration (pass `--force` to overwrite an existing file), `owlen config path` to print the resolved location, and `owlen config doctor` to migrate legacy layouts automatically.
You can also add custom themes alongside the config directory (e.g., `~/.config/owlen/themes/`).

See the [themes/README.md](themes/README.md) for more details on theming.
## Testing

Owlen uses standard Rust tooling for verification. Run the full test suite with:

```bash
cargo test
```

Unit tests cover the command palette state machine, agent response parsing, and key MCP abstractions. Formatting and lint checks can be run with `cargo fmt --all` and `cargo clippy` respectively.
## Roadmap

Upcoming milestones focus on feature parity with modern code assistants while keeping Owlen local-first:

1. **Phase 11 – MCP client enhancements**: `owlen mcp add/list/remove`, resource references (`@github:issue://123`), and MCP prompt slash commands.
2. **Phase 12 – Approval & sandboxing**: Three-tier approval modes plus platform-specific sandboxes (Docker, `sandbox-exec`, Windows job objects).
3. **Phase 13 – Project documentation system**: Automatic `OWLEN.md` generation, contextual updates, and nested project support.
4. **Phase 15 – Provider expansion**: OpenAI, Anthropic, and other cloud providers layered onto the existing Ollama-first architecture.

See `AGENTS.md` for the long-form roadmap and design notes.
## Contributing

Contributions are highly welcome! Please see our **[Contributing Guide](CONTRIBUTING.md)** for details on how to get started, including our code style, commit conventions, and pull request process.
## License

This project is licensed under the GNU Affero General Public License v3.0. See the [LICENSE](LICENSE) file for details.

For commercial or proprietary integrations that cannot adopt AGPL, please reach out to the maintainers to discuss alternative licensing arrangements.