# Vessel

A modern, feature-rich web interface for Ollama

[Features](#features) • [Screenshots](#screenshots) • [Quick Start](#quick-start) • [Installation](#installation) • [Configuration](#configuration) • [Development](#development)
---
## Features
### Core Chat Experience
- **Real-time streaming** — Watch responses appear token by token
- **Conversation history** — All chats stored locally in IndexedDB
- **Message editing** — Edit any message and regenerate responses with branching
- **Branch navigation** — Explore different response paths from edited messages
- **Markdown rendering** — Full GFM support with tables, lists, and formatting
- **Syntax highlighting** — Beautiful code blocks powered by Shiki with 100+ languages
- **Dark/Light mode** — Seamless theme switching with system preference detection
### Built-in Tools (Function Calling)
Vessel includes five powerful tools that models can invoke automatically:
| Tool | Description |
|------|-------------|
| **Web Search** | Search the internet for current information, news, weather, prices |
| **Fetch URL** | Read and extract content from any webpage |
| **Calculator** | Safe math expression parser with functions (sqrt, sin, cos, log, etc.) |
| **Get Location** | Detect user location via GPS or IP for local queries |
| **Get Time** | Current date/time with timezone support |
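
Ollama exposes tool calling through an OpenAI-style function schema on its `/api/chat` endpoint. The sketch below shows what such a request can look like; the `calculator` tool name and its parameter schema are illustrative, not Vessel's actual definitions:

```bash
# Illustrative /api/chat request with one tool definition (schema is an assumption)
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{"role": "user", "content": "What is sqrt(144) * 3?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "calculator",
      "description": "Evaluate a math expression",
      "parameters": {
        "type": "object",
        "properties": {
          "expression": {"type": "string", "description": "Expression to evaluate"}
        },
        "required": ["expression"]
      }
    }
  }],
  "stream": false
}'
```

If the model decides to use the tool, the response carries a `message.tool_calls` array instead of plain text; the client then executes the tool and sends the result back as a `tool`-role message so the model can produce its final answer.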
### Model Management
- **Model browser** — Browse, search, and pull models from Ollama registry
- **Live status** — See which models are currently loaded in memory
- **Quick switch** — Change models mid-conversation
- **Model metadata** — View parameters, quantization, and capabilities
### Developer Experience
- **Beautiful code generation** — Syntax-highlighted output for any language
- **Copy code blocks** — One-click copy with visual feedback
- **Scroll to bottom** — Smart auto-scroll with manual override
- **Keyboard shortcuts** — Navigate efficiently with hotkeys
---
## Screenshots
- Clean, modern chat interface
- Syntax-highlighted code output
- Integrated web search with styled results
- Light theme for daytime use
- Browse and manage Ollama models
---
## Quick Start
### One-Line Install
```bash
curl -fsSL https://somegit.dev/vikingowl/vessel/raw/main/install.sh | bash
```
### Or Clone and Run
```bash
git clone https://somegit.dev/vikingowl/vessel.git
cd vessel
./install.sh
```
The installer will:
- Check for Docker and Docker Compose
- Detect if you have Ollama installed locally (and let you choose)
- Start all services
- Optionally pull a starter model (llama3.2)
Once running, open **http://localhost:7842** in your browser.
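
To confirm the stack came up, you can check the containers and the UI port:

```bash
docker compose ps               # All services should report as running
curl -I http://localhost:7842   # The UI should answer with an HTTP status line
```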
---
## Installation
### Option 1: Install Script (Recommended)
The install script handles everything automatically:
```bash
./install.sh              # Install and start
./install.sh --update     # Update to latest version
./install.sh --uninstall  # Remove installation
```
**Features:**
- Detects local Ollama installation
- Configures Docker networking automatically
- Works on Linux and macOS
### Option 2: Docker Compose (Manual)
```bash
docker compose up -d
```
#### With GPU Support (NVIDIA)
Uncomment the GPU section in `docker-compose.yml`:
```yaml
ollama:
  image: ollama/ollama:latest
  deploy:
    resources:
      reservations:
        devices:
          - driver: nvidia
            count: all
            capabilities: [gpu]
```
Then run:
```bash
docker compose up -d
```
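
To sanity-check that the container can see the GPU (assuming the service is named `ollama`, as in the snippet above):

```bash
docker compose exec ollama nvidia-smi
```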
### Option 3: Manual Setup
#### Prerequisites
- [Node.js](https://nodejs.org/) 20+
- [Go](https://go.dev/) 1.24+
- [Ollama](https://ollama.com/) running locally
#### Frontend
```bash
cd frontend
npm install
npm run dev
```
Frontend runs on `http://localhost:5173`
#### Backend
```bash
cd backend
go mod tidy
go run cmd/server/main.go -port 9090
```
Backend API runs on `http://localhost:9090`
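
A quick smoke test, using the location endpoint from the API Reference below, confirms the server is answering:

```bash
curl http://localhost:9090/api/v1/location
```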
---
## Configuration
### Environment Variables
#### Frontend
| Variable | Default | Description |
|----------|---------|-------------|
| `OLLAMA_API_URL` | `http://localhost:11434` | Ollama API endpoint |
| `BACKEND_URL` | `http://localhost:9090` | Vessel backend API |
#### Backend
| Variable | Default | Description |
|----------|---------|-------------|
| `OLLAMA_URL` | `http://localhost:11434` | Ollama API endpoint |
| `PORT` | `8080` | Backend server port |
| `GIN_MODE` | `debug` | Gin mode (`debug`, `release`) |
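
For example, to point the backend at a non-default Ollama host and run it in release mode (values are illustrative):

```bash
export OLLAMA_URL=http://192.168.1.50:11434
export PORT=9090
export GIN_MODE=release
go run cmd/server/main.go
```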
### Docker Compose Override
Create `docker-compose.override.yml` for local customizations:
```yaml
services:
  frontend:
    environment:
      - CUSTOM_VAR=value
    ports:
      - "3000:3000" # Different port
```
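
To see how the override merges with the base file before starting anything:

```bash
docker compose config   # Print the merged configuration
docker compose up -d    # Start with the override applied
```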
---
## Architecture
```
vessel/
├── frontend/                   # SvelteKit 5 application
│   ├── src/
│   │   ├── lib/
│   │   │   ├── components/     # UI components
│   │   │   ├── stores/         # Svelte 5 runes state
│   │   │   ├── tools/          # Built-in tool definitions
│   │   │   ├── storage/        # IndexedDB (Dexie)
│   │   │   └── api/            # API clients
│   │   └── routes/             # SvelteKit routes
│   └── Dockerfile
│
├── backend/                    # Go API server
│   ├── cmd/server/             # Entry point
│   └── internal/
│       ├── api/                # HTTP handlers
│       │   ├── fetcher.go      # URL fetching with wget/curl/chromedp
│       │   ├── search.go       # Web search via DuckDuckGo
│       │   └── routes.go       # Route definitions
│       ├── database/           # SQLite storage
│       └── models/             # Data models
│
├── docker-compose.yml          # Production setup
└── docker-compose.dev.yml      # Development with hot reload
```
---
## Tech Stack
### Frontend
- **[SvelteKit 5](https://kit.svelte.dev/)** — Full-stack framework
- **[Svelte 5](https://svelte.dev/)** — Runes-based reactivity
- **[TypeScript](https://www.typescriptlang.org/)** — Type safety
- **[Tailwind CSS](https://tailwindcss.com/)** — Utility-first styling
- **[Skeleton UI](https://skeleton.dev/)** — Component library
- **[Shiki](https://shiki.matsu.io/)** — Syntax highlighting
- **[Dexie](https://dexie.org/)** — IndexedDB wrapper
- **[Marked](https://marked.js.org/)** — Markdown parser
- **[DOMPurify](https://github.com/cure53/DOMPurify)** — XSS sanitization
### Backend
- **[Go 1.24](https://go.dev/)** — Fast, compiled backend
- **[Gin](https://gin-gonic.com/)** — HTTP framework
- **[SQLite](https://sqlite.org/)** — Embedded database
- **[chromedp](https://github.com/chromedp/chromedp)** — Headless browser
---
## Development
### Running Tests
```bash
# Frontend unit tests
cd frontend
npm run test

# With coverage
npm run test:coverage

# Watch mode
npm run test:watch
```
### Type Checking
```bash
cd frontend
npm run check
```
### Development Mode
Use the dev compose file for hot reloading:
```bash
docker compose -f docker-compose.dev.yml up
```
---
## API Reference
### Backend Endpoints
| Method | Endpoint | Description |
|--------|----------|-------------|
| `POST` | `/api/v1/proxy/search` | Web search via DuckDuckGo |
| `POST` | `/api/v1/proxy/fetch` | Fetch URL content |
| `GET` | `/api/v1/location` | Get user location from IP |
| `GET` | `/api/v1/models/registry` | Browse Ollama model registry |
| `GET` | `/api/v1/models/search` | Search models |
| `POST` | `/api/v1/chats/sync` | Sync conversations |
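
The request body schemas are not documented here; the `query` and `url` fields below are assumptions for illustration only:

```bash
# Web search (field name "query" is an assumption)
curl -X POST http://localhost:9090/api/v1/proxy/search \
  -H "Content-Type: application/json" \
  -d '{"query": "ollama release notes"}'

# Fetch a URL (field name "url" is an assumption)
curl -X POST http://localhost:9090/api/v1/proxy/fetch \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com"}'
```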
### Ollama Proxy
All requests to `/ollama/*` are proxied to the Ollama API, which lets the browser reach Ollama without CORS restrictions.
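
For example, listing installed models through the proxy (assuming the proxy is exposed on the UI port):

```bash
curl http://localhost:7842/ollama/api/tags
```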
---
## Roadmap
- [ ] Image generation (Stable Diffusion, Hugging Face models)
- [ ] Hugging Face integration
- [ ] Voice input/output
- [ ] Multi-user support
- [ ] Plugin system
---
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
---
## License
Copyright (C) 2026 VikingOwl
This project is licensed under the GNU General Public License v3.0 - see the [LICENSE](LICENSE) file for details.
---
Made with Ollama and Svelte