vessel/docker-compose.dev.yml
vikingowl de835b7af7 feat: initial commit - Ollama WebUI with tools, sync, and backend
Complete Ollama Web UI implementation featuring:

Frontend (SvelteKit + Svelte 5 + Tailwind CSS + Skeleton UI):
- Chat interface with streaming responses and markdown rendering
- Message tree with branching support (edit creates branches)
- Vision model support with image upload/paste
- Code syntax highlighting with Shiki
- Built-in tools: get_current_time, calculate, fetch_url
- Function model middleware (functiongemma) for tool routing
- IndexedDB storage with Dexie.js
- Context window tracking with token estimation
- Knowledge base with embeddings (RAG support)
- Keyboard shortcuts and responsive design
- Export conversations as Markdown/JSON

Backend (Go + Gin + SQLite):
- RESTful API for conversations and messages
- SQLite persistence with branching message tree
- Sync endpoints for IndexedDB ↔ SQLite synchronization
- URL proxy endpoint for CORS-bypassed web fetching
- Health check endpoint
- Docker support with host network mode

Infrastructure:
- Docker Compose for development and production
- Vite proxy configuration for Ollama and backend APIs
- Hot reload development setup

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-31 08:11:33 +01:00


# Development docker-compose - uses host network for direct Ollama access
services:
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile.dev
    # Use host network to access localhost:11434 and backend directly
    network_mode: host
    volumes:
      - ./frontend:/app
      - /app/node_modules
    environment:
      - OLLAMA_API_URL=http://localhost:11434
      - BACKEND_URL=http://localhost:9090
    depends_on:
      - backend

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    network_mode: host
    volumes:
      - ./backend/data:/app/data
    environment:
      - GIN_MODE=release
    command: ["./server", "-port", "9090", "-db", "/app/data/ollama-webui.db", "-ollama-url", "http://localhost:11434"]
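Note that `network_mode: host` only gives containers direct access to `localhost:11434` on Linux; on Docker Desktop (macOS/Windows) containers run inside a VM, so the file above will not reach a host-side Ollama. Below is a minimal sketch of what a non-Linux variant might look like, using published ports and Docker Desktop's `host.docker.internal` alias instead of host networking. The filename, the `5173` Vite port, and the service-to-service URLs are assumptions for illustration, not part of the committed file:

```yaml
# Hypothetical docker-compose.dev.desktop.yml for macOS/Windows hosts,
# where host networking is unavailable: publish ports instead.
services:
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile.dev
    ports:
      - "5173:5173"   # Vite dev server default port (assumed)
    environment:
      # host.docker.internal resolves to the host machine on Docker Desktop
      - OLLAMA_API_URL=http://host.docker.internal:11434
      # Services on the default compose network reach each other by name
      - BACKEND_URL=http://backend:9090
    depends_on:
      - backend

  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    ports:
      - "9090:9090"
    volumes:
      - ./backend/data:/app/data
    environment:
      - GIN_MODE=release
    command: ["./server", "-port", "9090", "-db", "/app/data/ollama-webui.db", "-ollama-url", "http://host.docker.internal:11434"]
```

On Linux, the committed file works as-is with `docker compose -f docker-compose.dev.yml up --build`.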