# Vessel

A modern, feature-rich web interface for Ollama

Features • Quick Start • Documentation • Contributing
## Why Vessel
Vessel is intentionally focused on:
- A clean, local-first UI for Ollama
- Minimal configuration
- Low visual and cognitive overhead
- Doing a small set of things well
If you want a universal, highly configurable platform → open-webui is a great choice. If you want a small, focused UI for local Ollama usage → Vessel is built for that.
## Features

### Chat
- Real-time streaming responses
- Message editing with branch navigation
- Markdown rendering with syntax highlighting
- Dark/Light themes
### Tools
- 5 built-in tools: web search, URL fetching, calculator, location, time
- Custom tools: Create your own in JavaScript, Python, or HTTP
- Test tools before saving with the built-in testing panel
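As an illustration, an HTTP custom tool boils down to a name, a description the model sees, and an endpoint to call. The exact schema is Vessel's own, so every field and value below is hypothetical, not the actual format:

```json
{
  "name": "weather_lookup",
  "type": "http",
  "description": "Fetch current weather for a city (hypothetical example)",
  "url": "https://api.example.com/weather?city={city}",
  "method": "GET"
}
```

The built-in testing panel lets you run a definition like this against the endpoint before saving it.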
### Models
- Browse and pull models from ollama.com
- Create custom models with embedded system prompts
- Track model updates
### Prompts
- Save and organize system prompts
- Assign default prompts to specific models
- Capability-based auto-selection (vision, code, tools, thinking)
📖 Full documentation on the Wiki →
## Screenshots
- Clean chat interface
- Syntax-highlighted code
- Integrated web search
- Model browser
## Quick Start

### Prerequisites

- Docker
- A local Ollama installation

### Configure Ollama
Ollama must listen on all interfaces for Docker to connect:
```shell
# Option A: systemd (Linux)
sudo systemctl edit ollama
# Add: Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama

# Option B: Manual
OLLAMA_HOST=0.0.0.0 ollama serve
```
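For Option A, `systemctl edit` opens a drop-in override in your editor; after adding the `Environment=` line and saving, systemd writes a file along these lines (the exact path can vary by distribution):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```

Run `sudo systemctl restart ollama` afterwards so the new environment takes effect.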
### Install
```shell
# One-line install
curl -fsSL https://somegit.dev/vikingowl/vessel/raw/main/install.sh | bash

# Or clone and run
git clone https://github.com/VikingOwl91/vessel.git
cd vessel
./install.sh
```
Open http://localhost:7842 in your browser.
### Update / Uninstall
```shell
./install.sh --update      # Update to latest
./install.sh --uninstall   # Remove
```
📖 Detailed installation guide →
## Documentation
Full documentation is available on the GitHub Wiki:
| Guide | Description |
|---|---|
| Getting Started | Installation and configuration |
| Custom Tools | Create JavaScript, Python, or HTTP tools |
| System Prompts | Manage prompts with model defaults |
| Custom Models | Create models with embedded prompts |
| Built-in Tools | Reference for web search, calculator, etc. |
| API Reference | Backend endpoints |
| Development | Contributing and architecture |
| Troubleshooting | Common issues and solutions |
## Roadmap
Vessel prioritizes usability and simplicity over feature breadth.
Completed:
- Model browser with filtering and update detection
- Custom tools (JavaScript, Python, HTTP)
- System prompt library with model-specific defaults
- Custom model creation with embedded prompts
Planned:
- Keyboard-first workflows
- UX polish and stability improvements
- Optional voice input/output
Non-Goals:
- Multi-user systems
- Cloud sync
- Plugin ecosystems
- Support for every LLM runtime
Do one thing well. Keep the UI out of the way.
## Contributing
Contributions are welcome!
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes
- Push and open a Pull Request
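The steps above can be sketched as follows. The commands are shown in a throwaway repository so they are self-contained; in practice, run them inside your fork of vessel and pick your own branch name:

```shell
# Scratch repo so this sketch is self-contained; use your clone of vessel instead
tmp=$(mktemp -d) && cd "$tmp" && git init -q .

# Create a feature branch
git checkout -q -b feature/amazing-feature

# Commit a change (the -c identity flags are only needed if git is unconfigured)
echo "demo" > feature.txt
git add feature.txt
git -c user.email=you@example.com -c user.name=You commit -qm "Add amazing feature"

# Confirm which branch you are on, then push it and open a Pull Request upstream
git branch --show-current
```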
Issues: github.com/VikingOwl91/vessel/issues
## License
GPL-3.0 — See LICENSE for details.



