# Vessel
A modern, feature-rich web interface for Ollama
Features • Quick Start • Documentation • Contributing
## Why Vessel
Vessel is intentionally focused on:
- A clean, local-first UI for Ollama
- Minimal configuration
- Low visual and cognitive overhead
- Doing a small set of things well
If you want a universal, highly configurable platform → open-webui is a great choice. If you want a small, focused UI for local Ollama usage → Vessel is built for that.
## Features

### Chat
- Real-time streaming responses with token metrics
- Message branching — edit any message to create alternative conversation paths
- Markdown rendering with syntax highlighting
- Thinking mode — native support for reasoning models (DeepSeek-R1, etc.)
- Dark/Light themes
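Message branching can be pictured as a conversation tree: editing a message adds a sibling branch instead of overwriting history, so alternative paths share a common prefix. A minimal sketch of the idea (hypothetical, not Vessel's actual data model):

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    """One node in a conversation tree."""
    role: str
    content: str
    children: list = field(default_factory=list)

    def reply(self, role: str, content: str) -> "Message":
        """Append a child message and return it."""
        child = Message(role, content)
        self.children.append(child)
        return child

# Editing a reply creates a sibling branch rather than replacing it:
root = Message("user", "Explain recursion")
original = root.reply("assistant", "Recursion is a function calling itself.")
edited = root.reply("assistant", "Picture a set of nested mirrors...")
```

Here `root.children` holds two alternative continuations of the same conversation, which is what the branch switcher in the chat UI navigates between.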
### Projects & Organization
- Projects — group related conversations together
- Pin and archive conversations
- Smart title generation from conversation content
- Global search — semantic, title, and content search across all chats
### Knowledge Base (RAG)
- Upload documents (text, markdown, PDF) to build a knowledge base
- Semantic search using embeddings for context-aware retrieval
- Project-specific or global knowledge bases
- Automatic context injection into conversations
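The retrieval step behind semantic search amounts to ranking stored chunk embeddings by cosine similarity to the query embedding. An illustrative sketch with toy vectors (Vessel's actual pipeline and embedding model are not shown here):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k_chunks(query_embedding, chunks, k=3):
    """Return the k document chunks closest to the query embedding."""
    ranked = sorted(
        chunks,
        key=lambda c: cosine_similarity(query_embedding, c["embedding"]),
        reverse=True,
    )
    return ranked[:k]

# Toy 2-d "embeddings"; a real pipeline would use vectors from an
# embedding model.
chunks = [
    {"text": "ollama setup guide", "embedding": [1.0, 0.0]},
    {"text": "cooking recipes",    "embedding": [0.0, 1.0]},
    {"text": "ollama api notes",   "embedding": [0.9, 0.1]},
]
best = top_k_chunks([1.0, 0.0], chunks, k=2)
```

The top-ranked chunks are what gets injected into the conversation as context before the model answers.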
### Tools
- 5 built-in tools: web search, URL fetching, calculator, location, time
- Custom tools: Create your own in JavaScript, Python, or HTTP
- Agentic tool calling with chain-of-thought reasoning
- Test tools before saving with the built-in testing panel
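As an illustration, a custom Python tool might take a dict of arguments and return a JSON-serializable result for the model to use. The `run(args)` contract below is hypothetical, not Vessel's documented interface:

```python
def run(args: dict) -> dict:
    """Hypothetical custom tool: convert between kilometers and miles.

    `args` is the argument object the model supplies when calling the
    tool; the returned dict is fed back into the conversation.
    """
    value = float(args.get("value", 0))
    unit = args.get("unit", "km")
    if unit == "km":
        return {"miles": value * 0.621371}
    if unit == "mi":
        return {"km": value / 0.621371}
    return {"error": f"unsupported unit: {unit}"}
```

Returning an `error` key instead of raising keeps the failure visible to the model, which can then retry with corrected arguments during agentic tool calling.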
### Models
- Browse and pull models from ollama.com
- Create custom models with embedded system prompts
- Per-model parameters — customize temperature, context size, top_k/top_p
- Track model updates and capability detection (vision, tools, code)
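Per-model parameters map onto the `options` object of Ollama's REST API (`POST /api/generate`). A sketch of building such a request body; the default values shown are assumptions for illustration, not Vessel's defaults:

```python
import json

def build_generate_request(model: str, prompt: str, *,
                           temperature: float = 0.8,
                           num_ctx: int = 2048,
                           top_k: int = 40,
                           top_p: float = 0.9) -> dict:
    """Build a request body for Ollama's /api/generate endpoint.

    Sampling parameters live under the "options" key, so they can be
    set per request (and therefore per model) without editing the
    Modelfile.
    """
    return {
        "model": model,
        "prompt": prompt,
        "options": {
            "temperature": temperature,
            "num_ctx": num_ctx,  # context window size in tokens
            "top_k": top_k,
            "top_p": top_p,
        },
    }

body = build_generate_request("llama3", "Hello", temperature=0.2, num_ctx=8192)
payload = json.dumps(body)  # POST this to http://localhost:11434/api/generate
```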
### Prompts
- Save and organize system prompts
- Assign default prompts to specific models
- Capability-based auto-selection (vision, code, tools, thinking)
📖 Full documentation on the Wiki →
## Screenshots

*Screenshots in the repository show the clean chat interface, syntax-highlighted code, integrated web search, and the model browser.*
## Quick Start

### Prerequisites

- Docker (the install script runs Vessel in a container)
- A running Ollama installation

### Configure Ollama

Ollama must listen on all interfaces for Docker to connect:
```bash
# Option A: systemd (Linux)
sudo systemctl edit ollama
# Add: Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama

# Option B: manual
OLLAMA_HOST=0.0.0.0 ollama serve
```
### Install

```bash
# One-line install
curl -fsSL https://somegit.dev/vikingowl/vessel/raw/main/install.sh | bash

# Or clone and run
git clone https://github.com/VikingOwl91/vessel.git
cd vessel
./install.sh
```
Open http://localhost:7842 in your browser.
### Update / Uninstall

```bash
./install.sh --update      # Update to latest
./install.sh --uninstall   # Remove
```
📖 Detailed installation guide →
## Documentation
Full documentation is available on the GitHub Wiki:
| Guide | Description |
|---|---|
| Getting Started | Installation and configuration |
| Projects | Organize conversations into projects |
| Knowledge Base | RAG with document upload and semantic search |
| Search | Semantic and content search across chats |
| Custom Tools | Create JavaScript, Python, or HTTP tools |
| System Prompts | Manage prompts with model defaults |
| Custom Models | Create models with embedded prompts |
| Built-in Tools | Reference for web search, calculator, etc. |
| API Reference | Backend endpoints |
| Development | Contributing and architecture |
| Troubleshooting | Common issues and solutions |
## Roadmap
Vessel prioritizes usability and simplicity over feature breadth.
Completed:
- Model browser with filtering and update detection
- Custom tools (JavaScript, Python, HTTP)
- System prompt library with model-specific defaults
- Custom model creation with embedded prompts
- Projects for conversation organization
- Knowledge base with RAG (semantic retrieval)
- Global search (semantic, title, content)
- Thinking mode for reasoning models
- Message branching and conversation trees
Planned:
- Keyboard-first workflows
- UX polish and stability improvements
- Optional voice input/output
Non-Goals:
- Multi-user systems
- Cloud sync
- Plugin ecosystems
- Support for every LLM runtime
Do one thing well. Keep the UI out of the way.
## Contributing
Contributions are welcome!
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes
- Push and open a Pull Request
Issues: github.com/VikingOwl91/vessel/issues
## License
GPL-3.0 — See LICENSE for details.