# Migration Guide

This guide documents breaking changes between versions of Owlen and provides instructions for migrating your configuration or usage.

As Owlen is currently in its alpha phase (pre-v1.0), breaking changes may occur more frequently. We will do our best to document them here.

---

## Migrating from v0.x to v1.0 (MCP-Only Architecture)

**Version 1.0** marks a major milestone: Owlen has completed its transition to an **MCP-only architecture** (Model Context Protocol). This brings significant improvements in modularity, extensibility, and performance, but requires configuration updates.
### Breaking Changes

#### 1. MCP Mode now defaults to `remote_preferred`

The `[mcp]` section in `config.toml` still accepts a `mode` setting, but the default behaviour has changed. New installs default to the safer `remote_preferred` mode, which attempts to use any configured external MCP server and automatically falls back to the local in-process tooling when permitted. If you previously relied on `mode = "legacy"`, you can keep that line: it now maps to the `local_only` runtime and emits a compatibility warning instead of breaking outright.
**Supported values (v1.0+):**

| Value              | Behaviour |
|--------------------|-----------|
| `remote_preferred` | Default. Use the first configured `[[mcp_servers]]`; fall back to local if `allow_fallback = true`. |
| `remote_only`      | Require a configured server; the CLI will error if it cannot start. |
| `local_only`       | Force the built-in MCP client and the direct Ollama provider. |
| `legacy`           | Alias for `local_only` kept for compatibility (emits a warning). |
| `disabled`         | Not supported by the TUI; intended for headless tooling. |
You can additionally control the automatic fallback behaviour:

```toml
[mcp]
mode = "remote_preferred"
allow_fallback = true
warn_on_legacy = true
```
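For example, a setup that prefers a remote server but can drop back to the built-in client combines the `[mcp]` section with a `[[mcp_servers]]` entry (here using the `owlen-mcp-llm-server` stdio command shown later in this guide):

```toml
[mcp]
mode = "remote_preferred"
allow_fallback = true  # fall back to the built-in local client if the server fails

[[mcp_servers]]
name = "llm"
command = "owlen-mcp-llm-server"
transport = "stdio"
```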
#### 2. Direct Provider Access Removed (with opt-in compatibility)

In v0.x, Owlen could make direct HTTP calls to Ollama when in "legacy" mode. The default v1.0 behaviour keeps all LLM interactions behind MCP, but choosing `mode = "local_only"` or `mode = "legacy"` now reinstates the direct Ollama provider while still keeping the MCP tooling stack available locally.
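To opt back into the direct provider path explicitly, only the mode needs to change:

```toml
[mcp]
mode = "local_only"  # reinstates the direct Ollama provider; local MCP tooling stays available
```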
### What Changed Under the Hood

The v1.0 architecture implements the full 10-phase migration plan:

- **Phase 1-2**: File operations via MCP servers
- **Phase 3**: LLM inference via MCP servers (Ollama wrapped)
- **Phase 4**: Agent loop with ReAct pattern
- **Phase 5**: Mode system (chat/code) with tool availability
- **Phase 6**: Web search integration
- **Phase 7**: Code execution with Docker sandboxing
- **Phase 8**: Prompt server for versioned prompts
- **Phase 9**: Remote MCP server support (HTTP/WebSocket)
- **Phase 10**: Legacy mode removal and production polish
### Migration Steps

#### Step 1: Review Your MCP Configuration

Edit `~/.config/owlen/config.toml` and ensure the `[mcp]` section reflects how you want to run Owlen:

```toml
[mcp]
mode = "remote_preferred"
allow_fallback = true
```
If you encounter issues with remote servers, you can temporarily switch to:

```toml
[mcp]
mode = "local_only"  # or "legacy" for backwards compatibility
```

You will see a warning on startup when `legacy` is used so you remember to migrate later.

**Quick fix:** run `owlen config doctor` to apply these defaults automatically and validate your configuration file.
#### Step 2: Verify Provider Configuration

Ensure your provider configuration is correct. For Ollama:

```toml
[general]
default_provider = "ollama"
default_model = "llama3.2:latest"  # or your preferred model

[providers.ollama]
provider_type = "ollama"
base_url = "http://localhost:11434"

[providers.ollama-cloud]
provider_type = "ollama-cloud"
base_url = "https://ollama.com"
api_key = "$OLLAMA_API_KEY"  # Optional: for Ollama Cloud
```
#### Step 3: Understanding MCP Server Configuration

While not required for basic usage (Owlen will use the built-in local MCP client), you can optionally configure external MCP servers:

```toml
[[mcp_servers]]
name = "llm"
command = "owlen-mcp-llm-server"
transport = "stdio"

[[mcp_servers]]
name = "filesystem"
command = "/path/to/filesystem-server"
transport = "stdio"
```
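Remote servers over HTTP/WebSocket (Phase 9) can be configured similarly. The key names below (`transport = "websocket"`, `url`) are an illustrative sketch rather than a definitive schema; see `docs/configuration.md` for the authoritative reference:

```toml
# Hypothetical example: exact key names for remote transports may differ.
[[mcp_servers]]
name = "remote-tools"
transport = "websocket"
url = "wss://mcp.example.com/tools"
```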
**Note**: If no `mcp_servers` are configured, Owlen automatically falls back to its built-in local MCP client, which provides the same functionality.
#### Step 4: Verify Installation

After updating your config:

1. **Check Ollama is running**:
   ```bash
   curl http://localhost:11434/api/version
   ```

2. **List available models**:
   ```bash
   ollama list
   ```

3. **Test Owlen**:
   ```bash
   owlen
   ```
### Common Issues After Migration

#### Issue: "Warning: No MCP servers defined in config. Using local client."

**This is normal!** In v1.0+, if you don't configure external MCP servers, Owlen uses its built-in local MCP client. This provides the same functionality without needing separate server processes.

**No action required** unless you specifically want to use external MCP servers.
#### Issue: Timeouts on First Message

**Cause**: Ollama loads models into memory on first use, which can take 10-60 seconds for large models.

**Solution**:

- Be patient on first inference after model selection
- Use smaller models for faster loading (e.g., `llama3.2:latest` instead of `qwen3-coder:latest`)
- Pre-load models with: `ollama run <model-name>`
#### Issue: Cloud Models Return 404 Errors

**Cause**: Ollama Cloud model names may differ from local model names.

**Solution**:

- Verify model availability on https://ollama.com/models
- Remove the `-cloud` suffix from model names when using the cloud provider
- Ensure `api_key` is set in the `[providers.ollama-cloud]` config
### Rollback to v0.x

If you encounter issues and need to roll back:

1. **Reinstall v0.x**:
   ```bash
   # Using AUR (if applicable)
   yay -S owlen-git

   # Or from source
   git checkout <v0.x-tag>
   cargo install --path crates/owlen-tui
   ```

2. **Restore configuration**:
   ```toml
   [mcp]
   mode = "legacy"
   ```

3. **Report issues**: https://github.com/Owlibou/owlen/issues
### Benefits of v1.0 MCP Architecture

- **Modularity**: LLM, file operations, and tools are isolated in MCP servers
- **Extensibility**: Easy to add new tools and capabilities via the MCP protocol
- **Multi-Provider**: Support for multiple LLM providers through a standard interface
- **Remote Execution**: Can connect to remote MCP servers over HTTP/WebSocket
- **Better Error Handling**: Structured error responses from MCP servers
- **Agentic Capabilities**: ReAct pattern for autonomous task completion
### Getting Help

- **Documentation**: See the `docs/` directory for detailed guides
- **Issues**: https://github.com/Owlibou/owlen/issues
- **Configuration Reference**: `docs/configuration.md`
- **Troubleshooting**: `docs/troubleshooting.md`

---

## Future Migrations

We will continue to document breaking changes here as Owlen evolves. Always check this guide when upgrading to a new major version.