# Vessel

A modern, feature-rich web interface for Ollama

Features • Screenshots • Quick Start • Installation • Configuration • Development

## Features

### Core Chat Experience
- Real-time streaming — Watch responses appear token by token (sketched below this list)
- Conversation history — All chats stored locally in IndexedDB
- Message editing — Edit any message and regenerate responses with branching
- Branch navigation — Explore different response paths from edited messages
- Markdown rendering — Full GFM support with tables, lists, and formatting
- Syntax highlighting — Beautiful code blocks powered by Shiki with 100+ languages
- Dark/Light mode — Seamless theme switching with system preference detection
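
To make the streaming behavior concrete, here is a minimal TypeScript sketch that consumes the NDJSON stream from Ollama's `/api/chat` endpoint. It is an illustration under assumptions (direct Ollama access on the default port, `llama3.2` as an example model), not Vessel's actual client code:

```ts
// Minimal consumer for Ollama's streaming /api/chat endpoint.
// Each line of the response body is a JSON chunk carrying a partial
// assistant message; the final chunk has done: true.
async function streamChat(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2", // example model
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let full = "";

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? ""; // keep any incomplete trailing line
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      full += chunk.message?.content ?? ""; // token-by-token delta
    }
  }
  return full;
}
```

Vessel's UI renders these deltas into the chat view as they arrive, which is what produces the token-by-token effect.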
### Built-in Tools (Function Calling)

Vessel includes five powerful tools that models can invoke automatically (a schema sketch follows the table):
| Tool | Description |
|---|---|
| Web Search | Search the internet for current information, news, weather, prices |
| Fetch URL | Read and extract content from any webpage |
| Calculator | Safe math expression parser with functions (sqrt, sin, cos, log, etc.) |
| Get Location | Detect user location via GPS or IP for local queries |
| Get Time | Current date/time with timezone support |
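
For orientation, here is a hedged TypeScript sketch of how a tool such as the calculator could be declared in the `tools` field that Ollama's `/api/chat` accepts. The name, description, and parameter schema below are illustrative, not Vessel's actual definitions (those live in `frontend/src/lib/tools/`):

```ts
// Illustrative function-calling schema in the format Ollama accepts.
// Everything below is an example, not Vessel's real tool definition.
const calculatorTool = {
  type: "function",
  function: {
    name: "calculator",
    description:
      "Safely evaluate a math expression (sqrt, sin, cos, log, ...)",
    parameters: {
      type: "object",
      properties: {
        expression: {
          type: "string",
          description: "The expression to evaluate, e.g. 'sqrt(2) * 10'",
        },
      },
      required: ["expression"],
    },
  },
};

// Sent alongside the conversation; instead of plain text, the model
// may answer with a tool_calls entry naming the function to run.
const request = {
  model: "llama3.2", // example model
  messages: [{ role: "user", content: "What is sqrt(2) * 10?" }],
  tools: [calculatorTool],
};
```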
### Model Management
- Model browser — Browse, search, and pull models from Ollama registry
- Live status — See which models are currently loaded in memory (see the sketch after this list)
- Quick switch — Change models mid-conversation
- Model metadata — View parameters, quantization, and capabilities
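
Much of this data comes straight from Ollama's HTTP API. A rough sketch, assuming the `/ollama/*` proxy described under API Reference and Ollama's documented `/api/tags` and `/api/ps` endpoints:

```ts
// List installed models and models currently loaded in memory,
// going through Vessel's /ollama/* proxy. Sketch only; the base
// URL is an assumption for illustration.
async function getModelStatus(base = "http://localhost:7842") {
  const installed = await fetch(`${base}/ollama/api/tags`).then((r) => r.json());
  const loaded = await fetch(`${base}/ollama/api/ps`).then((r) => r.json());
  return {
    installed: installed.models.map((m: { name: string }) => m.name),
    loaded: loaded.models.map((m: { name: string }) => m.name),
  };
}
```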
### Developer Experience
- Beautiful code generation — Syntax-highlighted output for any language
- Copy code blocks — One-click copy with visual feedback
- Scroll to bottom — Smart auto-scroll with manual override
- Keyboard shortcuts — Navigate efficiently with hotkeys
## Screenshots

- Clean, modern chat interface
- Syntax-highlighted code output
- Integrated web search with styled results
- Light theme for daytime use
- Browse and manage Ollama models
## Quick Start

The fastest way to get running is with Docker Compose:

```bash
# Clone the repository
git clone https://github.com/yourusername/vessel.git
cd vessel

# Start all services (frontend, backend, ollama)
docker compose up -d

# Open in browser
open http://localhost:7842
```
This starts:

- Frontend on http://localhost:7842
- Backend API on http://localhost:9090
- Ollama on http://localhost:11434
### First Model

Pull your first model from the UI or via the command line:

```bash
# Via Ollama CLI
docker compose exec ollama ollama pull llama3.2

# Or use the Model Browser in the UI
```
## Installation

### Option 1: Docker Compose (Recommended)

```bash
docker compose up -d
```

#### With GPU Support (NVIDIA)

Uncomment the GPU section in `docker-compose.yml`:

```yaml
ollama:
  image: ollama/ollama:latest
  deploy:
    resources:
      reservations:
        devices:
          - driver: nvidia
            count: all
            capabilities: [gpu]
```

Then run:

```bash
docker compose up -d
```
### Option 2: Manual Setup

#### Prerequisites

- Node.js and npm (for the frontend)
- Go 1.24+ (for the backend)
- A running Ollama instance

#### Frontend
```bash
cd frontend
npm install
npm run dev
```

The frontend runs on http://localhost:5173.
#### Backend

```bash
cd backend
go mod tidy
go run cmd/server/main.go -port 9090
```

The backend API runs on http://localhost:9090.
## Configuration

### Environment Variables

#### Frontend

| Variable | Default | Description |
|---|---|---|
| `OLLAMA_API_URL` | `http://localhost:11434` | Ollama API endpoint |
| `BACKEND_URL` | `http://localhost:9090` | Vessel backend API |
#### Backend

| Variable | Default | Description |
|---|---|---|
| `OLLAMA_URL` | `http://localhost:11434` | Ollama API endpoint |
| `PORT` | `8080` | Backend server port |
| `GIN_MODE` | `debug` | Gin mode (`debug` or `release`) |
### Docker Compose Override

Create `docker-compose.override.yml` for local customizations:

```yaml
services:
  frontend:
    environment:
      - CUSTOM_VAR=value
    ports:
      - "3000:3000" # Different port
```
## Architecture

```
vessel/
├── frontend/                  # SvelteKit 5 application
│   ├── src/
│   │   ├── lib/
│   │   │   ├── components/    # UI components
│   │   │   ├── stores/        # Svelte 5 runes state
│   │   │   ├── tools/         # Built-in tool definitions
│   │   │   ├── storage/       # IndexedDB (Dexie)
│   │   │   └── api/           # API clients
│   │   └── routes/            # SvelteKit routes
│   └── Dockerfile
│
├── backend/                   # Go API server
│   ├── cmd/server/            # Entry point
│   └── internal/
│       ├── api/               # HTTP handlers
│       │   ├── fetcher.go     # URL fetching with wget/curl/chromedp
│       │   ├── search.go      # Web search via DuckDuckGo
│       │   └── routes.go      # Route definitions
│       ├── database/          # SQLite storage
│       └── models/            # Data models
│
├── docker-compose.yml         # Production setup
└── docker-compose.dev.yml     # Development with hot reload
```
## Tech Stack

### Frontend
- SvelteKit 5 — Full-stack framework
- Svelte 5 — Runes-based reactivity
- TypeScript — Type safety
- Tailwind CSS — Utility-first styling
- Skeleton UI — Component library
- Shiki — Syntax highlighting
- Dexie — IndexedDB wrapper (see the sketch after this list)
- Marked — Markdown parser
- DOMPurify — XSS sanitization
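
To give a flavor of the Dexie layer, here is a minimal sketch of persisting conversations to IndexedDB. The database name and schema are hypothetical, not Vessel's actual storage layout (that lives in `frontend/src/lib/storage/`):

```ts
import Dexie, { type Table } from "dexie";

// Hypothetical record shape; Vessel's real schema may differ.
interface Chat {
  id?: number;
  title: string;
  createdAt: number;
}

class VesselDB extends Dexie {
  chats!: Table<Chat, number>;

  constructor() {
    super("vessel"); // hypothetical database name
    this.version(1).stores({
      chats: "++id, title, createdAt", // auto-increment key + indexes
    });
  }
}

const db = new VesselDB();

// Usage: add a conversation, then list the most recent ones.
await db.chats.add({ title: "New chat", createdAt: Date.now() });
const recent = await db.chats.orderBy("createdAt").reverse().toArray();
```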
### Backend
- Go 1.24 — Fast, compiled backend
- Gin — HTTP framework
- SQLite — Embedded database
- chromedp — Headless browser
## Development

### Running Tests

```bash
# Frontend unit tests
cd frontend
npm run test

# With coverage
npm run test:coverage

# Watch mode
npm run test:watch
```
### Type Checking

```bash
cd frontend
npm run check
```

### Development Mode

Use the dev compose file for hot reloading:

```bash
docker compose -f docker-compose.dev.yml up
```
## API Reference

### Backend Endpoints

| Method | Endpoint | Description |
|---|---|---|
| `POST` | `/api/v1/proxy/search` | Web search via DuckDuckGo |
| `POST` | `/api/v1/proxy/fetch` | Fetch URL content |
| `GET` | `/api/v1/location` | Get user location from IP |
| `GET` | `/api/v1/models/registry` | Browse Ollama model registry |
| `GET` | `/api/v1/models/search` | Search models |
| `POST` | `/api/v1/chats/sync` | Sync conversations |
### Ollama Proxy

All requests to `/ollama/*` are proxied to the Ollama API, which lets the browser call Ollama without CORS issues.
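
As a hedged usage example, here is how the search endpoint might be called from TypeScript. The request body shape (`{ query }`) and base URL are assumptions for illustration; the real contract lives in `backend/internal/api/search.go`:

```ts
// Sketch of a call to the Vessel backend's web-search proxy.
// The body shape here is assumed, not taken from the source.
async function webSearch(query: string) {
  const res = await fetch("http://localhost:9090/api/v1/proxy/search", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  if (!res.ok) throw new Error(`search failed: ${res.status}`);
  return res.json();
}
```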
## Roadmap
- Image generation (Stable Diffusion, Hugging Face models)
- Hugging Face integration
- Voice input/output
- Multi-user support
- Plugin system
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License

Copyright (C) 2026 VikingOwl

This project is licensed under the GNU General Public License v3.0; see the LICENSE file for details.




