Docker compatibility:
- Discovery endpoints now read from OLLAMA_URL, LLAMACPP_URL, LMSTUDIO_URL env vars
- docker-compose.yml sets backends to host.docker.internal for container access
- justfile updated with --host 0.0.0.0 for llama-server
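For the discovery-endpoint change above, the fallback pattern amounts to reading each env var with a sensible default. A minimal TypeScript sketch; the helper name, the default ports, and where this code actually lives (the real endpoints may be served by the Go backend) are assumptions for illustration:
```typescript
// Illustrative env-var-driven backend discovery with defaults.
// OLLAMA_URL, LLAMACPP_URL, LMSTUDIO_URL come from the commit; everything else is assumed.
const DEFAULTS = {
  ollama: 'http://localhost:11434',
  llamacpp: 'http://localhost:8080',
  lmstudio: 'http://localhost:1234',
};

const ENV_KEYS: Record<keyof typeof DEFAULTS, string> = {
  ollama: 'OLLAMA_URL',
  llamacpp: 'LLAMACPP_URL',
  lmstudio: 'LMSTUDIO_URL',
};

function backendUrl(name: keyof typeof DEFAULTS): string {
  // In Docker, the env vars point at host.docker.internal so the container
  // can reach services running on the host machine.
  return process.env[ENV_KEYS[name]] ?? DEFAULTS[name];
}
```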
Vision support:
- OpenAI adapter now converts images to content parts array format
- Enables vision models with llama.cpp and LM Studio
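A sketch of that conversion to the OpenAI content-parts format; the attachment type here is illustrative, not Vessel's actual message model:
```typescript
// Convert a message with attached images into the OpenAI "content parts" array
// that llama.cpp's server and LM Studio accept.
interface ImageAttachment {
  mimeType: string; // e.g. "image/png"
  base64: string;   // raw base64 data without the data: prefix
}

type ContentPart =
  | { type: 'text'; text: string }
  | { type: 'image_url'; image_url: { url: string } };

function toOpenAIContent(text: string, images: ImageAttachment[]): string | ContentPart[] {
  if (images.length === 0) return text; // plain string stays fine for text-only messages
  const parts: ContentPart[] = [{ type: 'text', text }];
  for (const img of images) {
    parts.push({
      type: 'image_url',
      image_url: { url: `data:${img.mimeType};base64,${img.base64}` },
    });
  }
  return parts;
}
```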
Bumps version to 0.7.1
- Add curated prompt templates with categories (coding, writing, analysis,
creative, assistant) that users can browse and add to their library (template shape sketched after this list)
- Add "Browse Templates" tab to the Prompts page with category filtering
and preview functionality
- Add Design Brief Generator tool template for creating structured design
briefs from project requirements
- Add Color Palette Generator tool template for generating harmonious
color schemes from a base color
Prompts included: Code Reviewer, Refactoring Expert, Debug Assistant,
API Designer, SQL Expert, Technical Writer, Marketing Copywriter,
UI/UX Advisor, Security Auditor, Data Analyst, Creative Brainstormer,
Storyteller, Concise Assistant, Patient Teacher, Devil's Advocate,
Meeting Summarizer
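The curated templates above reduce to a small data shape plus a category filter. A rough TypeScript sketch; the field names and filter helper are illustrative, not Vessel's actual schema:
```typescript
// Illustrative shape for a curated prompt template and its category.
type TemplateCategory = 'coding' | 'writing' | 'analysis' | 'creative' | 'assistant';

interface PromptTemplate {
  id: string;
  name: string;          // e.g. "Code Reviewer"
  category: TemplateCategory;
  description: string;   // shown in the Browse Templates preview
  systemPrompt: string;  // copied into the user's library on "Add"
}

// Category filtering on the Browse Templates tab reduces to a simple predicate.
function filterByCategory(all: PromptTemplate[], category?: TemplateCategory): PromptTemplate[] {
  return category ? all.filter((t) => t.category === category) : all;
}
```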
Adds the ability to test HTTP endpoint custom tools directly in the
editor, matching the existing test functionality for Python and
JavaScript tools. Closes #6.
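A minimal sketch of what the in-editor test can do for an HTTP endpoint tool, assuming a JSON-in/JSON-out contract; the config shape and request format are illustrative:
```typescript
// Run an HTTP-endpoint tool once against sample input from the editor.
interface HttpToolConfig {
  url: string;
  method: 'GET' | 'POST';
  headers?: Record<string, string>;
}

async function testHttpTool(tool: HttpToolConfig, sampleInput: unknown): Promise<unknown> {
  const res = await fetch(tool.url, {
    method: tool.method,
    headers: { 'Content-Type': 'application/json', ...tool.headers },
    body: tool.method === 'POST' ? JSON.stringify(sampleInput) : undefined,
  });
  if (!res.ok) throw new Error(`Tool endpoint returned ${res.status}`);
  return res.json();
}
```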
- Add postinstall script to copy worker to static/
- Update Dockerfile to copy worker during build
- Update file-processor to try the local worker first, falling back to the CDN
- Bump version to 0.4.11
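A sketch of the local-first worker strategy, assuming the worker in question is the pdfjs-dist PDF worker copied into static/ by the postinstall script; the paths and CDN URL are illustrative:
```typescript
import * as pdfjs from 'pdfjs-dist';

// Prefer the locally bundled PDF worker; fall back to the CDN copy if it is missing.
async function configurePdfWorker(): Promise<void> {
  const localWorker = '/pdf.worker.min.mjs'; // assumed location under static/
  try {
    // HEAD request confirms the local copy is actually being served.
    const res = await fetch(localWorker, { method: 'HEAD' });
    if (!res.ok) throw new Error('local worker missing');
    pdfjs.GlobalWorkerOptions.workerSrc = localWorker;
  } catch {
    // CDN fallback matching the installed pdfjs-dist version.
    pdfjs.GlobalWorkerOptions.workerSrc =
      `https://cdn.jsdelivr.net/npm/pdfjs-dist@${pdfjs.version}/build/pdf.worker.min.mjs`;
  }
}
```
The HEAD check is only there so a missing static copy degrades gracefully instead of breaking PDF parsing.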
- Add capability verification for installed models using /api/show
- SyncModels now updates real capabilities when fetchDetails=true
- Model browser shows verified/unverified badges for capabilities
- Add info notice that capabilities are sourced from ollama.com
- Fix incorrect capability data (e.g., deepseek-r1 "tools" badge)
Capabilities scraped from the ollama.com website may be inaccurate. Once a model
is installed, Vessel fetches the actual capabilities from the Ollama runtime
and displays a "verified" badge in the model details panel.
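The verification itself boils down to one call per installed model. A TypeScript sketch of that step (the real sync runs server-side); it assumes /api/show returns a capabilities array, and the result shape is illustrative:
```typescript
// Verify an installed model's capabilities against the Ollama runtime.
interface VerifiedCapabilities {
  vision: boolean;
  tools: boolean;
  thinking: boolean;
  verified: true;
}

async function verifyCapabilities(ollamaUrl: string, model: string): Promise<VerifiedCapabilities> {
  const res = await fetch(`${ollamaUrl}/api/show`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model }),
  });
  if (!res.ok) throw new Error(`/api/show failed for ${model}: ${res.status}`);
  const data = await res.json();
  // Assumes the response carries a capabilities array, e.g. ["completion", "vision", "tools"].
  const caps: string[] = data.capabilities ?? [];
  return {
    vision: caps.includes('vision'),
    tools: caps.includes('tools'),
    thinking: caps.includes('thinking'),
    verified: true,
  };
}
```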
- Add size filter (≤3B, 4-13B, 14-70B, >70B) based on model tags
- Add model family filter dropdown with dynamic family list
- Display last updated date on model cards (scraped from ollama.com)
- Add /api/v1/models/remote/families endpoint
- Convert relative time strings ("2 weeks ago") to timestamps during sync
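The conversion is a small parsing step (the real one runs during backend sync). A TypeScript sketch, with month and year lengths approximated:
```typescript
// Convert ollama.com's relative "last updated" strings into timestamps at sync time.
// Only the unit words below are handled; anything else returns undefined.
function relativeToTimestamp(text: string, now: Date = new Date()): Date | undefined {
  const match = text.trim().match(/^(\d+)\s+(second|minute|hour|day|week|month|year)s?\s+ago$/i);
  if (!match) return undefined;
  const amount = Number(match[1]);
  const unitMs: Record<string, number> = {
    second: 1000,
    minute: 60_000,
    hour: 3_600_000,
    day: 86_400_000,
    week: 7 * 86_400_000,
    month: 30 * 86_400_000,  // approximation; month lengths vary
    year: 365 * 86_400_000,  // approximation
  };
  return new Date(now.getTime() - amount * unitMs[match[2].toLowerCase()]);
}
```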
- Add CodeMirror editor with syntax highlighting for JavaScript and Python
- Add 8 starter templates (4 JS, 4 Python) for common tool patterns
- Add inline documentation panel with language-specific guidance
- Add tool testing UI to run tools with sample inputs before saving
- Add Python tool execution via backend API with 30s timeout
- Add POST /api/v1/tools/execute endpoint for backend tool execution
- Update Dockerfile to include Python 3 for tool execution
- Bump version to 0.4.0
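A sketch of how the editor's test UI can call the execution endpoint above; the JSON field names are assumptions, and the client-side abort merely mirrors the backend's 30-second limit:
```typescript
// Send tool code plus sample input to the backend for execution.
interface ToolExecuteResult {
  output?: string;
  error?: string;
}

async function executeToolOnBackend(code: string, language: 'python', input: unknown): Promise<ToolExecuteResult> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 30_000); // match the backend's 30s timeout
  try {
    const res = await fetch('/api/v1/tools/execute', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ language, code, input }),
      signal: controller.signal,
    });
    if (!res.ok) return { error: `Execution failed with status ${res.status}` };
    return (await res.json()) as ToolExecuteResult;
  } finally {
    clearTimeout(timer);
  }
}
```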
Auto-scroll now stops when the top of the assistant's response
reaches the top of the viewport, allowing users to read from
the beginning while more content streams below.
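A sketch of that stop condition, assuming a handle on the scroll container and on the streaming assistant message element:
```typescript
// Keep following the stream only while the top of the assistant's message
// is still below the top of the scroll container.
function shouldAutoScroll(container: HTMLElement, assistantMessage: HTMLElement): boolean {
  const containerTop = container.getBoundingClientRect().top;
  const messageTop = assistantMessage.getBoundingClientRect().top;
  // Once the response's first line hits the top of the viewport, stop scrolling so the
  // user can read from the beginning while streaming continues below.
  return messageTop > containerTop;
}

function followStream(container: HTMLElement, assistantMessage: HTMLElement): void {
  if (shouldAutoScroll(container, assistantMessage)) {
    container.scrollTop = container.scrollHeight;
  }
}
```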
- Update package.json name to "vessel"
- Update storage keys (vessel-settings, vessel IndexedDB)
- Update Go module to vessel-backend with new imports
- Update database path to vessel.db
- Add new Vessel "V" icon (favicon + app icons)
- Update all user-facing branding (titles, sidebar, settings)
- Update docker-compose files with vessel naming and network
- Change accent color from emerald to violet
Testing:
- Add vitest with jsdom environment for unit testing
- Create 61 comprehensive tests for keyboard shortcuts
- Add test helpers for platform switching and key simulation
- Mock SvelteKit $app modules for testing
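A sketch of the platform-switching test helper under vitest + jsdom mentioned above; the helper name and the two tests are illustrative, not the actual 61-test suite:
```typescript
import { describe, it, expect } from 'vitest';

// Force jsdom's navigator.platform to a specific value for a test.
function setPlatform(platform: string): void {
  Object.defineProperty(window.navigator, 'platform', {
    value: platform,
    configurable: true, // allow later tests to redefine it
  });
}

describe('platform switching helper', () => {
  it('makes the environment look like macOS', () => {
    setPlatform('MacIntel');
    expect(navigator.platform).toBe('MacIntel');
  });

  it('makes the environment look like Windows', () => {
    setPlatform('Win32');
    expect(navigator.platform).toBe('Win32');
  });
});
```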
Fix:
- Change Windows/Linux shortcuts from Ctrl to Alt to avoid
browser shortcut conflicts (Ctrl+N opens a new browser window)
- Mac shortcuts remain Cmd+N/K/B (unaffected)
New shortcuts on Windows/Linux:
- Alt+N: New chat
- Alt+K: Search conversations
- Alt+B: Toggle sidebar
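A sketch of the platform-aware modifier check behind these bindings; the handler wiring and the commented-out actions are illustrative:
```typescript
// Cmd on macOS, Alt on Windows/Linux (Ctrl was dropped because e.g. Ctrl+N
// opens a new browser window).
function isShortcut(event: KeyboardEvent, key: 'n' | 'k' | 'b'): boolean {
  const isMac = /Mac/i.test(navigator.platform);
  const modifier = isMac ? event.metaKey : event.altKey;
  return modifier && event.key.toLowerCase() === key;
}

function handleKeydown(event: KeyboardEvent): void {
  if (isShortcut(event, 'n')) {
    event.preventDefault();
    // newChat();        // open a new chat
  } else if (isShortcut(event, 'k')) {
    event.preventDefault();
    // openSearch();     // search conversations
  } else if (isShortcut(event, 'b')) {
    event.preventDefault();
    // toggleSidebar();  // toggle sidebar
  }
}

window.addEventListener('keydown', handleKeydown);
```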
Thinking mode:
- Add native Ollama `think: true` API parameter support
- Create ThinkingBlock component with collapsible UI and streaming indicator
- Allow expanding/collapsing thinking blocks during streaming
- Pass showThinking prop through the component chain so thinking blocks are hidden when disabled
- Auto-generate smart chat titles using LLM after first response
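A sketch of the native thinking request; it assumes Ollama's /api/chat streams the reasoning in message.thinking alongside message.content, and the chunk handling is simplified (it treats each chunk as whole JSON lines):
```typescript
// Stream a chat completion with Ollama's native thinking mode enabled.
async function streamWithThinking(ollamaUrl: string, model: string, prompt: string): Promise<void> {
  const res = await fetch(`${ollamaUrl}/api/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model,
      think: true, // native thinking-mode parameter
      stream: true,
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let done = false;
  while (!done) {
    const chunk = await reader.read();
    done = chunk.done;
    if (!chunk.value) continue;
    for (const line of decoder.decode(chunk.value).split('\n').filter(Boolean)) {
      const data = JSON.parse(line);
      if (data.message?.thinking) {
        // feed the ThinkingBlock component's streaming indicator
      }
      if (data.message?.content) {
        // append to the visible assistant response
      }
    }
  }
}
```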
File uploads:
- Add FileUpload component supporting images, text files, and PDFs
- Create FilePreview component for non-image attachments
- Add file-processor utility for text extraction and PDF parsing
- Text/PDF content injected as context for all models
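A sketch of that context-injection step, assuming a simple delimiter format around the extracted text; Vessel's actual prompt wrapping may differ:
```typescript
// Prepend extracted text/PDF content to the user's message so attachments work
// for every model, regardless of native file support.
interface ExtractedFile {
  name: string;
  text: string; // produced by the file-processor utility
}

function withFileContext(userMessage: string, files: ExtractedFile[]): string {
  if (files.length === 0) return userMessage;
  const context = files
    .map((f) => `--- ${f.name} ---\n${f.text}`)
    .join('\n\n');
  return `The user attached the following files:\n\n${context}\n\n${userMessage}`;
}
```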
Model capabilities:
- Add ModelCapabilityIcons component showing vision/tools/code badges
- Detect model capabilities from name patterns in models store
- Display capability icons in model selector dropdown
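A sketch of the name-pattern heuristic; the pattern lists are illustrative and intentionally incomplete (runtime verification via /api/show arrives for installed models in a later release):
```typescript
// Guess capability badges from a model's name.
interface ModelCapabilities {
  vision: boolean;
  tools: boolean;
  code: boolean;
}

function detectCapabilities(modelName: string): ModelCapabilities {
  const name = modelName.toLowerCase();
  return {
    vision: /llava|vision|moondream|bakllava/.test(name),
    tools: /llama3|qwen|mistral|functiongemma/.test(name),
    code: /code|coder|codellama|starcoder|deepseek-coder/.test(name),
  };
}
```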
Complete Ollama Web UI implementation featuring:
Frontend (SvelteKit + Svelte 5 + Tailwind CSS + Skeleton UI):
- Chat interface with streaming responses and markdown rendering
- Message tree with branching support (edit creates branches)
- Vision model support with image upload/paste
- Code syntax highlighting with Shiki
- Built-in tools: get_current_time, calculate, fetch_url (sketched after this list)
- Function model middleware (functiongemma) for tool routing
- IndexedDB storage with Dexie.js
- Context window tracking with token estimation
- Knowledge base with embeddings (RAG support)
- Keyboard shortcuts and responsive design
- Export conversations as Markdown/JSON
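As referenced above, a sketch of how the built-in tools can be defined, with fetch_url routed through the backend's URL proxy to dodge CORS; the ToolDefinition shape and the /api/v1/proxy route are assumptions for illustration:
```typescript
// Illustrative built-in tool table.
interface ToolDefinition {
  name: string;
  description: string;
  run: (args: Record<string, string>) => Promise<string>;
}

const builtinTools: ToolDefinition[] = [
  {
    name: 'get_current_time',
    description: 'Returns the current date and time in ISO format.',
    run: async () => new Date().toISOString(),
  },
  {
    name: 'calculate',
    description: 'Evaluates a simple arithmetic expression.',
    run: async ({ expression }) => {
      // Restrict to digits and basic operators before evaluating.
      if (!/^[\d+\-*/().\s]+$/.test(expression)) throw new Error('unsupported expression');
      return String(Function(`"use strict"; return (${expression});`)());
    },
  },
  {
    name: 'fetch_url',
    description: 'Fetches a web page through the backend proxy to bypass CORS.',
    run: async ({ url }) => {
      // The proxy route name is assumed; the backend exposes a CORS-bypassing fetch endpoint.
      const res = await fetch(`/api/v1/proxy?url=${encodeURIComponent(url)}`);
      return res.text();
    },
  },
];
```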
Backend (Go + Gin + SQLite):
- RESTful API for conversations and messages
- SQLite persistence with branching message tree
- Sync endpoints for IndexedDB ↔ SQLite synchronization
- URL proxy endpoint for CORS-bypassed web fetching
- Health check endpoint
- Docker support with host network mode
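The branching message tree mentioned in both lists comes down to parent pointers plus an active leaf. A sketch with illustrative field names, not the actual SQLite/IndexedDB schema:
```typescript
// Each message points at its parent; editing a message adds a sibling under the
// same parent instead of overwriting, and the conversation view is the path from
// the root to the currently active leaf.
interface MessageNode {
  id: string;
  parentId: string | null; // null for the conversation root
  role: 'user' | 'assistant';
  content: string;
  createdAt: number;
}

// Walk upward from the active leaf to produce the linear thread to display.
function activeThread(nodes: Map<string, MessageNode>, leafId: string): MessageNode[] {
  const thread: MessageNode[] = [];
  let current = nodes.get(leafId);
  while (current) {
    thread.unshift(current);
    current = current.parentId ? nodes.get(current.parentId) : undefined;
  }
  return thread;
}
```
With this shape, "edit creates branches" is just inserting a new node with the same parentId and switching the active leaf.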
Infrastructure:
- Docker Compose for development and production
- Vite proxy configuration for Ollama and backend APIs
- Hot reload development setup