# OWLEN

> Terminal-native assistant for running local language models with a comfortable TUI.

## Alpha Status

- This project is currently in **alpha** (v0.1.0) and under active development.
- Core features are functional, but expect occasional bugs and missing polish.
- Breaking changes may occur between releases as we refine the API.
- Feedback, bug reports, and contributions are very welcome!

## What Is OWLEN?

OWLEN is a Rust-powered, terminal-first interface for interacting with local large
language models. It provides a responsive chat workflow that runs against
[Ollama](https://ollama.com/) with a focus on developer productivity, vim-style navigation,
and seamless session management, all without leaving your terminal.

## Screenshots

### Initial Layout

The OWLEN interface features a clean, multi-panel layout with vim-inspired navigation. See more screenshots in the [`images/`](images/) directory, including:

- Full chat conversations (`chat_view.png`)
- Help menu (`help.png`)
- Model selection (`model_select.png`)
- Visual selection mode (`select_mode.png`)

## Features

### Chat Client (`owlen`)

- **Vim-style Navigation** - Normal, editing, visual, and command modes
- **Streaming Responses** - Real-time token streaming from Ollama (see the example after this list)
- **Multi-Panel Interface** - Separate panels for chat, thinking content, and input
- **Advanced Text Editing** - Multi-line input with `tui-textarea`, history navigation
- **Visual Selection & Clipboard** - Yank/paste text across panels
- **Flexible Scrolling** - Half-page, full-page, and cursor-based navigation
- **Model Management** - Interactive model and provider selection (press `m`)
- **Session Management** - Start new conversations, clear history
- **Thinking Mode Support** - Dedicated panel for extended reasoning content
- **Bracketed Paste** - Safe paste handling for multi-line content

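Streaming works against Ollama's `/api/chat` endpoint, which returns newline-delimited JSON chunks. The snippet below is a standalone illustration of consuming that stream; it is not owlen's own provider code, and it assumes `reqwest` (with the `json`/`stream` features), `futures-util`, `serde_json`, and `tokio` as dependencies:

```rust
// Standalone sketch: stream a chat reply from a local Ollama server.
// Not owlen's provider implementation; error handling is trimmed for brevity.
use futures_util::StreamExt;
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let body = json!({
        "model": "llama3.2:latest",
        "messages": [{ "role": "user", "content": "Hello!" }],
        "stream": true
    });

    let resp = reqwest::Client::new()
        .post("http://localhost:11434/api/chat")
        .json(&body)
        .send()
        .await?;

    // Ollama streams one JSON object per line.
    let mut stream = resp.bytes_stream();
    let mut buf = String::new();
    while let Some(chunk) = stream.next().await {
        buf.push_str(&String::from_utf8_lossy(&chunk?));
        while let Some(pos) = buf.find('\n') {
            let line: String = buf.drain(..=pos).collect();
            if line.trim().is_empty() {
                continue;
            }
            let v: Value = serde_json::from_str(line.trim())?;
            if let Some(tok) = v["message"]["content"].as_str() {
                print!("{tok}"); // token rendered as it arrives
            }
            if v["done"].as_bool() == Some(true) {
                println!();
                return Ok(());
            }
        }
    }
    Ok(())
}
```
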
### Code Client (`owlen-code`) [Experimental]

- All chat client features
- Optimized system prompt for programming assistance
- Foundation for future code-specific features

### Core Infrastructure

- **Modular Architecture** - Separated core logic, TUI components, and providers
- **Provider System** - Extensible provider trait (currently: Ollama); see the sketch below
- **Session Controller** - Unified conversation and state management
- **Configuration Management** - TOML-based config with sensible defaults
- **Message Formatting** - Markdown rendering, thinking content extraction
- **Async Runtime** - Built on Tokio for efficient streaming

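The provider trait is the main extension point for new backends. As a rough illustration only (the trait name, types, and signatures below are assumptions, not the actual `owlen-core` API), a streaming provider abstraction might look like this:

```rust
// Hypothetical sketch of a streaming provider abstraction.
// Names and signatures are illustrative, not the real owlen-core trait.
use tokio::sync::mpsc;

/// A single message in a conversation.
#[derive(Debug, Clone)]
pub struct ChatMessage {
    pub role: String,    // "system", "user", or "assistant"
    pub content: String,
}

/// Events emitted while a response streams in.
#[derive(Debug)]
pub enum StreamEvent {
    Token(String),   // next chunk of assistant text
    Done,            // generation finished
    Error(String),   // provider-side failure
}

/// Anything that can turn a conversation into a streamed reply.
pub trait Provider: Send + Sync {
    /// Stable identifier, e.g. "ollama".
    fn name(&self) -> &str;

    /// Start generating a reply; chunks arrive on the returned channel.
    fn stream_chat(
        &self,
        model: &str,
        messages: Vec<ChatMessage>,
    ) -> mpsc::Receiver<StreamEvent>;
}
```

An Ollama-backed implementation of such a trait would live in `owlen-ollama` and forward tokens from the HTTP stream onto the channel.
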
## Getting Started

### Prerequisites

- Rust 1.75+ and Cargo (`rustup` recommended)
- A running Ollama instance (default endpoint `http://localhost:11434`) with at least
  one model pulled, e.g. `ollama pull llama3.2`
- A terminal that supports 256 colors

### Clone and Build

```bash
git clone https://somegit.dev/Owlibou/owlen.git
cd owlen
cargo build --release
```

### Run the Chat Client

Make sure Ollama is running, then launch:

```bash
./target/release/owlen
# or during development:
cargo run --bin owlen
```

### (Optional) Try the Code Client

The coding-focused TUI is experimental:

```bash
cargo build --release --bin owlen-code --features code-client
./target/release/owlen-code
```

## Using the TUI

### Mode System (Vim-inspired)

**Normal Mode** (default):

- `i` / `Enter` - Enter editing mode
- `a` - Append (move right and enter editing mode)
- `A` - Append at end of line
- `I` - Insert at start of line
- `o` - Insert new line below
- `O` - Insert new line above
- `v` - Enter visual mode (text selection)
- `:` - Enter command mode
- `h/j/k/l` - Navigate left/down/up/right
- `w/b/e` - Word navigation
- `0/$` - Jump to line start/end
- `gg` - Jump to top
- `G` - Jump to bottom
- `Ctrl-d/u` - Half-page scroll
- `Ctrl-f/b` - Full-page scroll
- `Tab` - Cycle focus between panels
- `p` - Paste from clipboard
- `dd` - Clear input buffer
- `q` - Quit

**Editing Mode**:

- `Esc` - Return to normal mode
- `Enter` - Send message and return to normal mode
- `Ctrl-J` / `Shift-Enter` - Insert newline
- `Ctrl-↑/↓` - Navigate input history
- Paste events handled automatically

**Visual Mode**:

- `j/k/h/l` - Extend selection
- `w/b/e` - Word-based selection
- `y` - Yank (copy) selection
- `d` - Cut selection (Input panel only)
- `Esc` - Cancel selection

**Command Mode**:

- `:q` / `:quit` - Quit application
- `:c` / `:clear` - Clear conversation
- `:m` / `:model` - Open model selector
- `:n` / `:new` - Start new conversation
- `:h` / `:help` - Show help

### Panel Management

- Three panels: Chat, Thinking, and Input
- `Tab` / `Shift-Tab` - Cycle focus forward/backward (see the sketch below)
- Focused panel receives scroll and navigation commands
- Thinking panel appears when extended reasoning is available

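To make the focus model concrete, here is a small sketch of Tab-style panel cycling. `FocusedPanel` is named in the architecture notes below, but the variants and methods here are illustrative assumptions rather than the actual owlen-core definition:

```rust
// Illustrative sketch of cycling focus between the three panels.
// Variants and methods are assumptions, not the exact owlen-core type.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum FocusedPanel {
    Chat,
    Thinking,
    Input,
}

impl FocusedPanel {
    /// `Tab`: move focus to the next panel.
    fn next(self) -> Self {
        match self {
            FocusedPanel::Chat => FocusedPanel::Thinking,
            FocusedPanel::Thinking => FocusedPanel::Input,
            FocusedPanel::Input => FocusedPanel::Chat,
        }
    }

    /// `Shift-Tab`: move focus to the previous panel.
    fn prev(self) -> Self {
        match self {
            FocusedPanel::Chat => FocusedPanel::Input,
            FocusedPanel::Thinking => FocusedPanel::Chat,
            FocusedPanel::Input => FocusedPanel::Thinking,
        }
    }
}

fn main() {
    let mut focus = FocusedPanel::Chat;
    focus = focus.next(); // Tab       -> Thinking
    focus = focus.prev(); // Shift-Tab -> Chat
    assert_eq!(focus, FocusedPanel::Chat);
}
```

Whichever panel holds focus then receives the scroll and navigation commands described above.
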
## Configuration

OWLEN stores configuration in `~/.config/owlen/config.toml`. The file is created
on first run and can be edited to customize behavior:

```toml
[general]
default_model = "llama3.2:latest"
default_provider = "ollama"
enable_streaming = true
project_context_file = "OWLEN.md"

[providers.ollama]
provider_type = "ollama"
base_url = "http://localhost:11434"
timeout = 300
```

Configuration is automatically saved when you change models or providers.

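For reference, a file like the one above can be deserialized with `serde` and the `toml` crate. A minimal loading sketch, assuming hypothetical struct names and the `dirs` crate for locating the config directory (not necessarily how owlen-core does it):

```rust
// Minimal sketch of loading ~/.config/owlen/config.toml.
// Struct and field names mirror the example config above, but they are
// assumptions, not the real owlen-core definitions.
use serde::Deserialize;
use std::{collections::HashMap, fs, path::PathBuf};

#[derive(Debug, Deserialize)]
struct Config {
    general: General,
    providers: HashMap<String, ProviderConfig>,
}

#[derive(Debug, Deserialize)]
struct General {
    default_model: String,
    default_provider: String,
    enable_streaming: bool,
    project_context_file: String,
}

#[derive(Debug, Deserialize)]
struct ProviderConfig {
    provider_type: String,
    base_url: String,
    timeout: u64,
}

fn load_config() -> Result<Config, Box<dyn std::error::Error>> {
    let path = dirs::config_dir()
        .unwrap_or_else(|| PathBuf::from("."))
        .join("owlen")
        .join("config.toml");
    let text = fs::read_to_string(path)?;
    Ok(toml::from_str(&text)?)
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let cfg = load_config()?;
    println!("default model: {}", cfg.general.default_model);
    Ok(())
}
```
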
## Repository Layout

```
owlen/
├── crates/
│   ├── owlen-core/     # Core types, session management, shared UI components
│   ├── owlen-ollama/   # Ollama provider implementation
│   ├── owlen-tui/      # TUI components (chat_app, code_app, rendering)
│   └── owlen-cli/      # Binary entry points (owlen, owlen-code)
├── LICENSE             # MIT License
├── Cargo.toml          # Workspace configuration
└── README.md
```

### Architecture Highlights

- **owlen-core**: Provider-agnostic core with session controller, UI primitives (AutoScroll, InputMode, FocusedPanel), and shared utilities
- **owlen-tui**: Ratatui-based UI implementation with vim-style modal editing (see the dispatch sketch below)
- **Separation of Concerns**: Clean boundaries between business logic, presentation, and provider implementations

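As an illustration of the modal design, key events can be routed through the current `InputMode`. The dispatch function and transitions below are assumptions for this sketch, not the actual owlen-tui handler:

```rust
// Simplified illustration of vim-style modal key routing.
// Not the real owlen-tui event handler.
use crossterm::event::{KeyCode, KeyEvent, KeyModifiers};

#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum InputMode {
    Normal,
    Editing,
    Visual,
    Command,
}

/// Decide the next mode for a key press, given the current mode.
fn next_mode(mode: InputMode, key: KeyEvent) -> InputMode {
    match (mode, key.code) {
        // From normal mode, keys switch into the other modes.
        (InputMode::Normal, KeyCode::Char('i')) | (InputMode::Normal, KeyCode::Enter) => {
            InputMode::Editing
        }
        (InputMode::Normal, KeyCode::Char('v')) => InputMode::Visual,
        (InputMode::Normal, KeyCode::Char(':')) => InputMode::Command,
        // Esc always drops back to normal mode.
        (_, KeyCode::Esc) => InputMode::Normal,
        // Everything else is handled within the current mode.
        _ => mode,
    }
}

fn main() {
    let key = KeyEvent::new(KeyCode::Char('i'), KeyModifiers::NONE);
    assert_eq!(next_mode(InputMode::Normal, key), InputMode::Editing);
}
```
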
## Development

### Building

```bash
# Debug build
cargo build

# Release build
cargo build --release

# Build with all features
cargo build --all-features

# Run tests
cargo test

# Check code
cargo clippy
cargo fmt
```

### Development Notes

- Standard Rust workflows apply (`cargo fmt`, `cargo clippy`, `cargo test`)
- Codebase uses async Rust (`tokio`) for event handling and streaming; see the sketch after this list
- Configuration is cached in `~/.config/owlen` (wipe to reset)
- UI components are extensively tested in `owlen-core/src/ui.rs`

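As a sketch of that shape (channel names and event handling here are assumptions for illustration, not owlen's internals), key input and streamed tokens can be multiplexed in a single `tokio::select!` loop:

```rust
// Illustrative event loop: multiplex key presses and streamed LLM tokens.
// Channel and event names are assumptions for this sketch.
use tokio::sync::mpsc;

async fn run(mut keys: mpsc::Receiver<char>, mut tokens: mpsc::Receiver<String>) {
    loop {
        tokio::select! {
            biased; // drain pending tokens before handling keys

            Some(tok) = tokens.recv() => {
                print!("{tok}");              // append streamed text to the chat panel
            }
            Some(key) = keys.recv() => {
                if key == 'q' { break; }      // quit, as in normal mode
                // other keys: update modal/input state here
            }
            else => break,                    // both channels closed
        }
    }
    println!();
}

#[tokio::main]
async fn main() {
    let (key_tx, key_rx) = mpsc::channel::<char>(16);
    let (tok_tx, tok_rx) = mpsc::channel::<String>(16);

    // Simulate a streamed reply followed by the user quitting.
    tokio::spawn(async move {
        for word in ["Hello", ", ", "world!"] {
            let _ = tok_tx.send(word.to_string()).await;
        }
        let _ = key_tx.send('q').await;
    });

    run(key_rx, tok_rx).await;
}
```
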
## Roadmap

### Completed ✓

- [x] Streaming responses with real-time display
- [x] Autoscroll and viewport management
- [x] Push user message before loading LLM response
- [x] Thinking mode support with dedicated panel
- [x] Vim-style modal editing (Normal, Visual, Command modes)
- [x] Multi-panel focus management
- [x] Text selection and clipboard functionality
- [x] Comprehensive keyboard navigation
- [x] Bracketed paste support

### In Progress

- [ ] Theming options and color customization
- [ ] Enhanced configuration UX (in-app settings)
- [ ] Chat history management (save/load/export)

### Planned

- [ ] Code Client Enhancement
  - [ ] In-project code navigation
  - [ ] Syntax highlighting for code blocks
  - [ ] File tree browser integration
  - [ ] Project-aware context management
  - [ ] Code snippets and templates
- [ ] Additional LLM Providers
  - [ ] OpenAI API support
  - [ ] Anthropic Claude support
  - [ ] Local model providers (llama.cpp, etc.)
- [ ] Advanced Features
  - [ ] Conversation search and filtering
  - [ ] Multi-session management
  - [ ] Export conversations (Markdown, JSON)
  - [ ] Custom keybindings
  - [ ] Plugin system

## Contributing

Contributions are welcome! Here's how to get started:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes and add tests
4. Run `cargo fmt` and `cargo clippy`
5. Commit your changes (`git commit -m 'Add amazing feature'`)
6. Push to the branch (`git push origin feature/amazing-feature`)
7. Open a Pull Request

Please open an issue first for significant changes to discuss the approach.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

Built with:

- [ratatui](https://ratatui.rs/) - Terminal UI framework
- [crossterm](https://github.com/crossterm-rs/crossterm) - Cross-platform terminal manipulation
- [tokio](https://tokio.rs/) - Async runtime
- [Ollama](https://ollama.com/) - Local LLM runtime

---

**Status**: Alpha v0.1.0 | **License**: MIT | **Made with Rust** 🦀