Vessel
A modern, feature-rich web interface for Ollama
Features • Screenshots • Quick Start • Installation • Configuration • Development
Features
Core Chat Experience
- Real-time streaming — Watch responses appear token by token
- Conversation history — All chats stored locally in IndexedDB
- Message editing — Edit any message and regenerate responses with branching
- Branch navigation — Explore different response paths from edited messages
- Markdown rendering — Full GFM support with tables, lists, and formatting
- Syntax highlighting — Beautiful code blocks powered by Shiki with 100+ languages
- Dark/Light mode — Seamless theme switching with system preference detection
Built-in Tools (Function Calling)
Vessel includes five built-in tools that models can invoke automatically; a sketch of how a tool definition reaches the model follows the table:
| Tool | Description |
|---|---|
| Web Search | Search the internet for current information, news, weather, prices |
| Fetch URL | Read and extract content from any webpage |
| Calculator | Safe math expression parser with functions (sqrt, sin, cos, log, etc.) |
| Get Location | Detect user location via GPS or IP for local queries |
| Get Time | Current date/time with timezone support |
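Under the hood these are standard Ollama function-calling tools. As a rough illustration only (the schema below is a sketch; Vessel's actual definitions live in frontend/src/lib/tools/), a tool is advertised to the model through Ollama's /api/chat endpoint like this:

# Sketch: advertising a calculator tool via Ollama's chat API
# (illustrative schema -- see frontend/src/lib/tools/ for Vessel's real definitions)
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{"role": "user", "content": "What is 15% of 240?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "calculator",
      "description": "Safely evaluate a math expression",
      "parameters": {
        "type": "object",
        "properties": {"expression": {"type": "string"}},
        "required": ["expression"]
      }
    }
  }],
  "stream": false
}'

If the model decides to use a tool, the response carries a tool_calls entry instead of plain text; Vessel runs the matching tool and feeds the result back into the conversation.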
Model Management
- Model browser — Browse, search, and pull models from Ollama registry
- Live status — See which models are currently loaded in memory
- Quick switch — Change models mid-conversation
- Model metadata — View parameters, quantization, and capabilities
Developer Experience
- Beautiful code generation — Syntax-highlighted output for any language
- Copy code blocks — One-click copy with visual feedback
- Scroll to bottom — Smart auto-scroll with manual override
- Keyboard shortcuts — Navigate efficiently with hotkeys
Screenshots
- Clean, modern chat interface
- Syntax-highlighted code output
- Integrated web search with styled results
- Light theme for daytime use
- Browse and manage Ollama models
Quick Start
Prerequisites
- Ollama installed and running locally
- Docker and Docker Compose
One-Line Install
curl -fsSL https://somegit.dev/vikingowl/vessel/raw/main/install.sh | bash
Or Clone and Run
git clone https://somegit.dev/vikingowl/vessel.git
cd vessel
./install.sh
The installer will:
- Check for Docker, Docker Compose, and Ollama
- Start the frontend and backend services
- Optionally pull a starter model (llama3.2)
Once running, open http://localhost:7842 in your browser.
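If you skipped the starter model prompt, you can pull one at any time with the Ollama CLI:

# Pull the same starter model the installer offers
ollama pull llama3.2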
Installation
Option 1: Install Script (Recommended)
The install script handles everything automatically:
./install.sh # Install and start
./install.sh --update # Update to latest version
./install.sh --uninstall # Remove installation
Requirements:
- Ollama must be installed and running locally
- Docker and Docker Compose
- Linux or macOS
Option 2: Docker Compose (Manual)
# Make sure Ollama is running first
ollama serve
# Start Vessel
docker compose up -d
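Because the containers talk to Ollama on the host, it's worth confirming Ollama is actually reachable before starting Vessel (assuming the default port 11434):

# Returns a small JSON payload with the Ollama version if the server is up
curl http://localhost:11434/api/version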
Option 3: Manual Setup (Development)
Prerequisites
- Node.js and npm (frontend)
- Go 1.24+ (backend)
- A running Ollama instance
Frontend
cd frontend
npm install
npm run dev
Frontend runs on http://localhost:5173
Backend
cd backend
go mod tidy
go run cmd/server/main.go -port 9090
Backend API runs on http://localhost:9090
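Once both are running, a quick way to smoke-test the backend is to hit one of the endpoints listed in the API Reference below, for example:

# Simple GET against the dev backend; returns the caller's IP-derived location
curl http://localhost:9090/api/v1/location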
Configuration
Environment Variables
Frontend
| Variable | Default | Description |
|---|---|---|
| OLLAMA_API_URL | http://localhost:11434 | Ollama API endpoint |
| BACKEND_URL | http://localhost:9090 | Vessel backend API |
Backend
| Variable | Default | Description |
|---|---|---|
| OLLAMA_URL | http://localhost:11434 | Ollama API endpoint |
| PORT | 8080 | Backend server port |
| GIN_MODE | debug | Gin mode (debug or release) |
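Outside Docker, these are ordinary environment variables. For example, to run the backend in release mode against a non-default Ollama host (the host value here is illustrative):

# Backend reads its configuration from the environment
# (example host; substitute your own)
OLLAMA_URL=http://192.168.1.50:11434 GIN_MODE=release go run cmd/server/main.go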
Docker Compose Override
Create docker-compose.override.yml for local customizations:
services:
  frontend:
    environment:
      - CUSTOM_VAR=value
    ports:
      - "3000:3000" # Different port
Architecture
vessel/
├── frontend/ # SvelteKit 5 application
│ ├── src/
│ │ ├── lib/
│ │ │ ├── components/ # UI components
│ │ │ ├── stores/ # Svelte 5 runes state
│ │ │ ├── tools/ # Built-in tool definitions
│ │ │ ├── storage/ # IndexedDB (Dexie)
│ │ │ └── api/ # API clients
│ │ └── routes/ # SvelteKit routes
│ └── Dockerfile
│
├── backend/ # Go API server
│ ├── cmd/server/ # Entry point
│ └── internal/
│ ├── api/ # HTTP handlers
│ │ ├── fetcher.go # URL fetching with wget/curl/chromedp
│ │ ├── search.go # Web search via DuckDuckGo
│ │ └── routes.go # Route definitions
│ ├── database/ # SQLite storage
│ └── models/ # Data models
│
├── docker-compose.yml # Production setup
└── docker-compose.dev.yml # Development with hot reload
Tech Stack
Frontend
- SvelteKit 5 — Full-stack framework
- Svelte 5 — Runes-based reactivity
- TypeScript — Type safety
- Tailwind CSS — Utility-first styling
- Skeleton UI — Component library
- Shiki — Syntax highlighting
- Dexie — IndexedDB wrapper
- Marked — Markdown parser
- DOMPurify — XSS sanitization
Backend
- Go 1.24 — Fast, compiled backend
- Gin — HTTP framework
- SQLite — Embedded database
- chromedp — Headless browser
Development
Running Tests
# Frontend unit tests
cd frontend
npm run test
# With coverage
npm run test:coverage
# Watch mode
npm run test:watch
Type Checking
cd frontend
npm run check
Development Mode
Use the dev compose file for hot reloading:
docker compose -f docker-compose.dev.yml up
API Reference
Backend Endpoints
| Method | Endpoint | Description |
|---|---|---|
| POST | /api/v1/proxy/search | Web search via DuckDuckGo |
| POST | /api/v1/proxy/fetch | Fetch URL content |
| GET | /api/v1/location | Get user location from IP |
| GET | /api/v1/models/registry | Browse Ollama model registry |
| GET | /api/v1/models/search | Search models |
| POST | /api/v1/chats/sync | Sync conversations |
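Request bodies are defined by the handlers in backend/internal/api/. As a sketch only (the field name below is an assumption; check search.go for the real shape), a search call looks roughly like:

# Hypothetical body -- verify field names against backend/internal/api/search.go
curl -X POST http://localhost:9090/api/v1/proxy/search \
  -H 'Content-Type: application/json' \
  -d '{"query": "current weather in Oslo"}'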
Ollama Proxy
All requests to /ollama/* are proxied to the Ollama API, which avoids browser CORS restrictions.
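For example, assuming the proxy is exposed on Vessel's main port (7842), listing installed models passes through to Ollama's /api/tags:

# Forwarded by Vessel to http://localhost:11434/api/tags
curl http://localhost:7842/ollama/api/tags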
Roadmap
- Docker Ollama support (for systems without local Ollama)
- Image generation (Stable Diffusion, Hugging Face models)
- Hugging Face integration
- Voice input/output
- Multi-user support
- Plugin system
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add some amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
Copyright (C) 2026 VikingOwl
This project is licensed under the GNU General Public License v3.0 - see the LICENSE file for details.




