Update README: refine content, add screenshots, improve structure, and document new features and usage details.
> Terminal-native assistant for running local language models with a comfortable TUI.

## Alpha Status

- This project is currently in **alpha** (v0.1.0) and under active development.
- Core features are functional but expect occasional bugs and missing polish.
- Breaking changes may occur between releases as we refine the API.
- Feedback, bug reports, and contributions are very welcome!

## What Is OWLEN?

OWLEN is a Rust-powered, terminal-first interface for interacting with local large
language models. It provides a responsive chat workflow that runs against
[Ollama](https://ollama.com/) with a focus on developer productivity, vim-style navigation,
and seamless session management, all without leaving your terminal.

## Screenshots

### Initial Layout

![OWLEN initial layout](images/layout.png)

The OWLEN interface features a clean, multi-panel layout with vim-inspired navigation. See more screenshots in the [`images/`](images/) directory including:

- Full chat conversations (`chat_view.png`)
- Help menu (`help.png`)
- Model selection (`model_select.png`)
- Visual selection mode (`select_mode.png`)

## Features

### Chat Client (`owlen`)

- **Vim-style Navigation** - Normal, editing, visual, and command modes
- **Streaming Responses** - Real-time token streaming from Ollama
- **Multi-Panel Interface** - Separate panels for chat, thinking content, and input
- **Advanced Text Editing** - Multi-line input with `tui-textarea`, history navigation
- **Visual Selection & Clipboard** - Yank/paste text across panels
- **Flexible Scrolling** - Half-page, full-page, and cursor-based navigation
- **Model Management** - Interactive model and provider selection (press `m`)
- **Session Management** - Start new conversations, clear history
- **Thinking Mode Support** - Dedicated panel for extended reasoning content
- **Bracketed Paste** - Safe paste handling for multi-line content

### Code Client (`owlen-code`) [Experimental]

- All chat client features
- Optimized system prompt for programming assistance
- Foundation for future code-specific features

### Core Infrastructure

- **Modular Architecture** - Separated core logic, TUI components, and providers
- **Provider System** - Extensible provider trait (currently: Ollama)
- **Session Controller** - Unified conversation and state management
- **Configuration Management** - TOML-based config with sensible defaults
- **Message Formatting** - Markdown rendering, thinking content extraction
- **Async Runtime** - Built on Tokio for efficient streaming
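The extensible provider trait is not shown in this README, but the shape of the idea can be sketched as follows. This is an illustrative sketch only: the trait, method names, and the `EchoProvider` type are hypothetical and do not reflect the actual `owlen-core` API, and streaming is modeled here with a callback rather than Tokio streams to keep the example self-contained.

```rust
// Hypothetical sketch of a pluggable provider trait (names are illustrative,
// not the real owlen-core API). Streaming is modeled with a token callback.
trait Provider {
    fn name(&self) -> &str;
    fn list_models(&self) -> Vec<String>;
    // Stream a completion, invoking `on_token` once per generated token.
    fn complete(&self, prompt: &str, on_token: &mut dyn FnMut(&str));
}

// A toy backend showing how a new provider would plug in.
struct EchoProvider;

impl Provider for EchoProvider {
    fn name(&self) -> &str {
        "echo"
    }
    fn list_models(&self) -> Vec<String> {
        vec!["echo:latest".to_string()]
    }
    fn complete(&self, prompt: &str, on_token: &mut dyn FnMut(&str)) {
        // "Stream" the prompt back one word at a time.
        for word in prompt.split_whitespace() {
            on_token(word);
        }
    }
}

fn main() {
    let provider: Box<dyn Provider> = Box::new(EchoProvider);
    let mut tokens = Vec::new();
    provider.complete("hello from owlen", &mut |t| tokens.push(t.to_string()));
    println!("{} -> {:?}", provider.name(), tokens);
}
```

In the real crate the completion method would be async (the workspace targets Rust 1.75+, where async fns in traits are available) and yield tokens from the Ollama HTTP stream; the callback form above just keeps the sketch runnable.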
## Getting Started

### Prerequisites

- Rust 1.75+ and Cargo (`rustup` recommended)
- A running Ollama instance with at least one model pulled (defaults to `http://localhost:11434`)
- A terminal that supports 256 colors

### Clone and Build

```bash
git clone https://somegit.dev/Owlibou/owlen.git
cd owlen
cargo build --release
```

### Run the Chat Client

Make sure Ollama is running, then launch:

```bash
./target/release/owlen
# or during development:
cargo run --bin owlen
```

### (Optional) Try the Code Client

The coding-focused TUI is experimental and ships behind a feature flag:

```bash
cargo build --release --bin owlen-code --features code-client
./target/release/owlen-code
```

## Using the TUI

The status line surfaces hints, error messages, and streaming progress.

### Mode System (Vim-inspired)

**Normal Mode** (default):

- `i` / `Enter` - Enter editing mode
- `a` - Append (move right and enter editing mode)
- `A` - Append at end of line
- `I` - Insert at start of line
- `o` - Insert new line below
- `O` - Insert new line above
- `v` - Enter visual mode (text selection)
- `:` - Enter command mode
- `h/j/k/l` - Navigate left/down/up/right
- `w/b/e` - Word navigation
- `0/$` - Jump to line start/end
- `gg` - Jump to top
- `G` - Jump to bottom
- `Ctrl-d/u` - Half-page scroll
- `Ctrl-f/b` - Full-page scroll
- `Tab` - Cycle focus between panels
- `p` - Paste from clipboard
- `dd` - Clear input buffer
- `q` - Quit

**Editing Mode**:

- `Esc` - Return to normal mode
- `Enter` - Send message and return to normal mode
- `Ctrl-J` / `Shift-Enter` - Insert newline
- `Ctrl-↑/↓` - Navigate input history
- Paste events handled automatically

**Visual Mode**:

- `j/k/h/l` - Extend selection
- `w/b/e` - Word-based selection
- `y` - Yank (copy) selection
- `d` - Cut selection (Input panel only)
- `Esc` - Cancel selection

**Command Mode**:

- `:q` / `:quit` - Quit application
- `:c` / `:clear` - Clear conversation
- `:m` / `:model` - Open model selector
- `:n` / `:new` - Start new conversation
- `:h` / `:help` - Show help

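The modal behavior above is driven by an input-mode state machine (the architecture notes below mention an `InputMode` primitive in `owlen-core`). A minimal sketch of the transitions, with hypothetical variant and method names that may not match the real implementation:

```rust
// Simplified sketch of the vim-style mode state machine. Variant and method
// names are illustrative; see owlen-core's actual InputMode for the real API.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum InputMode {
    Normal,
    Editing,
    Visual,
    Command,
}

impl InputMode {
    // Map a key pressed in Normal mode to the next mode.
    fn on_normal_key(key: char) -> InputMode {
        match key {
            'i' | 'a' | 'A' | 'I' | 'o' | 'O' => InputMode::Editing,
            'v' => InputMode::Visual,
            ':' => InputMode::Command,
            _ => InputMode::Normal,
        }
    }

    // Esc returns any mode to Normal.
    fn on_escape(self) -> InputMode {
        InputMode::Normal
    }
}

fn main() {
    assert_eq!(InputMode::on_normal_key('i'), InputMode::Editing);
    assert_eq!(InputMode::on_normal_key(':'), InputMode::Command);
    assert_eq!(InputMode::Visual.on_escape(), InputMode::Normal);
    println!("mode transitions ok");
}
```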
### Panel Management

- Three panels: Chat, Thinking, and Input
- `Tab` / `Shift-Tab` - Cycle focus forward/backward
- Focused panel receives scroll and navigation commands
- Thinking panel appears when extended reasoning is available

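How reasoning content ends up in its own panel can be illustrated with a small extraction helper. This is a sketch under an assumption: it presumes reasoning arrives inline wrapped in `<think>...</think>` tags, as some thinking-capable Ollama models emit; the actual parsing in `owlen-core` may work differently (for example, via a separate field in the streaming response).

```rust
// Hypothetical sketch: split a model response into visible chat text and
// "thinking" text, assuming reasoning is wrapped in <think>...</think> tags.
fn split_thinking(response: &str) -> (String, String) {
    match (response.find("<think>"), response.find("</think>")) {
        (Some(start), Some(end)) if start < end => {
            let thinking = response[start + "<think>".len()..end].trim().to_string();
            // Visible text is everything outside the thinking block.
            let mut visible = String::new();
            visible.push_str(&response[..start]);
            visible.push_str(&response[end + "</think>".len()..]);
            (visible.trim().to_string(), thinking)
        }
        // No thinking block: everything goes to the chat panel.
        _ => (response.trim().to_string(), String::new()),
    }
}

fn main() {
    let (chat, thinking) = split_thinking("<think>weigh the options</think>The answer is 42.");
    println!("chat: {chat:?}, thinking: {thinking:?}");
}
```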
## Configuration

OWLEN stores configuration in `~/.config/owlen/config.toml`. The file is created
on first run and can be edited to customize behavior:

```toml
[general]
default_model = "llama3.2:latest"
default_provider = "ollama"
enable_streaming = true
project_context_file = "OWLEN.md"

[providers.ollama]
provider_type = "ollama"
base_url = "http://localhost:11434"
timeout = 300
```

Additional sections cover UI preferences, file limits, and storage paths.
Configuration is automatically saved when you change models or providers.

## Repository Layout

```
owlen/
├── crates/
│   ├── owlen-core/    # Core types, session management, shared UI components
│   ├── owlen-ollama/  # Ollama provider implementation
│   ├── owlen-tui/     # TUI components (chat_app, code_app, rendering)
│   └── owlen-cli/     # Binary entry points (owlen, owlen-code)
├── LICENSE            # MIT License
├── Cargo.toml         # Workspace configuration
└── README.md
```

### Architecture Highlights

- **owlen-core**: Provider-agnostic core with session controller, UI primitives (AutoScroll, InputMode, FocusedPanel), and shared utilities
- **owlen-tui**: Ratatui-based UI implementation with vim-style modal editing
- **Separation of Concerns**: Clean boundaries between business logic, presentation, and provider implementations

## Development

### Building

```bash
# Debug build
cargo build

# Release build
cargo build --release

# Build with all features
cargo build --all-features

# Run tests
cargo test

# Check code
cargo clippy
cargo fmt
```

### Development Notes

- Standard Rust workflows apply (`cargo fmt`, `cargo clippy`, `cargo test`)
- The codebase uses async Rust (`tokio`) for event handling and streaming
- Configuration is cached in `~/.config/owlen` (wipe to reset)
- UI components are extensively tested in `owlen-core/src/ui.rs`

## Roadmap

### Completed ✓

- [x] Streaming responses with real-time display
- [x] Autoscroll and viewport management
- [x] Push user message before loading LLM response
- [x] Thinking mode support with dedicated panel
- [x] Vim-style modal editing (Normal, Visual, Command modes)
- [x] Multi-panel focus management
- [x] Text selection and clipboard functionality
- [x] Comprehensive keyboard navigation
- [x] Bracketed paste support

### In Progress

- [ ] Theming options and color customization
- [ ] Enhanced configuration UX (in-app settings)
- [ ] Chat history management (save/load/export)

### Planned

- [ ] Code Client Enhancement
  - [ ] In-project code navigation
  - [ ] Syntax highlighting for code blocks
  - [ ] File tree browser integration
  - [ ] Project-aware context management
  - [ ] Code snippets and templates
- [ ] Additional LLM Providers
  - [ ] OpenAI API support
  - [ ] Anthropic Claude support
  - [ ] Local model providers (llama.cpp, etc.)
- [ ] Advanced Features
  - [ ] Conversation search and filtering
  - [ ] Multi-session management
  - [ ] Export conversations (Markdown, JSON)
  - [ ] Custom keybindings
  - [ ] Plugin system

## Contributing

Contributions are welcome! Here's how to get started:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes and add tests
4. Run `cargo fmt` and `cargo clippy`
5. Commit your changes (`git commit -m 'Add amazing feature'`)
6. Push to the branch (`git push origin feature/amazing-feature`)
7. Open a Pull Request

Please open an issue first for significant changes to discuss the approach.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

Built with:

- [ratatui](https://ratatui.rs/) - Terminal UI framework
- [crossterm](https://github.com/crossterm-rs/crossterm) - Cross-platform terminal manipulation
- [tokio](https://tokio.rs/) - Async runtime
- [Ollama](https://ollama.com/) - Local LLM runtime

---

**Status**: Alpha v0.1.0 | **License**: MIT | **Made with Rust** 🦀