# Troubleshooting Guide

This guide is intended to help you with common issues you might encounter while using Owlen.
## Connection Failures to Ollama
If you are unable to connect to a local Ollama instance, here are a few things to check:
- Is Ollama running? Make sure the Ollama service is active. You can usually check this with `ollama list`.
- Is the address correct? By default, Owlen tries to connect to `http://localhost:11434`. If your Ollama instance is running on a different address or port, you will need to configure it in your `config.toml` file.
- Firewall issues: Ensure that your firewall is not blocking the connection.
- Health check warnings: Owlen now performs a provider health check on startup. If it fails, the error message will include a hint (either "start owlen-mcp-llm-server" or "ensure Ollama is running"). Resolve the hint and restart.
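If your Ollama instance listens on a non-default address, you can point Owlen at it in `config.toml`. This is a minimal sketch; the `[providers.ollama]` table name comes from the settings mentioned later in this guide, but the `base_url` key and the address shown are illustrative assumptions — check your own config for the exact key:

```toml
# Illustrative snippet: adjust host and port to match your Ollama instance.
[providers.ollama]
base_url = "http://192.168.1.50:11434"
```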
## Model Not Found Errors
Owlen surfaces this as `InvalidInput: Model '<name>' was not found`.
- Local models: Run `ollama list` to confirm the model name (e.g., `llama3:8b`). Use `ollama pull <model>` if it is missing.
- Ollama Cloud: Names may differ from local installs. Double-check https://ollama.com/models and remove `-cloud` suffixes.
- Fallback: Switch to `mode = "local_only"` temporarily in `[mcp]` if the remote server is slow to update.

Fix the name in your configuration file or choose a model from the UI (`:model`).
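The temporary fallback looks like this in `config.toml` (a sketch using only the `mode` key named above; any other keys in your `[mcp]` table stay as they are):

```toml
[mcp]
mode = "local_only"   # temporary fallback while the remote catalogue catches up
```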
## Terminal Compatibility Issues
Owlen is built with `ratatui`, which supports most modern terminals. However, if you are experiencing rendering issues, please check the following:
- Your terminal supports Unicode.
- You are using a font that includes the characters being displayed.
- Try a different terminal emulator to see if the issue persists.
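A quick way to check the first two points is to print a few of the box-drawing and block characters a `ratatui` UI typically draws. The exact characters Owlen uses are an assumption here; any Unicode-capable terminal and font should render all of them cleanly:

```shell
# If any of these render as boxes or question marks, the terminal
# font or encoding is the likely culprit.
printf 'box drawing: ┌─┬─┐ │ ├─┼─┤ └─┴─┘\n'
printf 'blocks:      ░ ▒ ▓ █ ▀ ▄\n'
# The locale should mention UTF-8 (e.g. en_US.UTF-8); an empty or
# non-UTF-8 value often explains garbled output.
printf 'LANG=%s\n' "$LANG"
```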
## Configuration File Problems
If Owlen is not behaving as you expect, there might be an issue with your configuration file.
- Location: Run `owlen config path` to print the exact location (Linux, macOS, or Windows). Owlen now follows platform defaults instead of hard-coding `~/.config`.
- Syntax: The configuration file is in TOML format. Make sure the syntax is correct.
- Values: Check that the values for your models, providers, and other settings are correct.
- Automation: Run `owlen config doctor` to migrate legacy settings (`mode = "legacy"`, missing providers) and validate the file before launching the TUI.
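For orientation, a minimal configuration might look like the sketch below. It only uses settings named in this guide (`[mcp] mode`, `[providers.ollama]`, `privacy.encrypt_local_data`); defaults and any additional keys are assumptions, so treat `owlen config doctor` as the authority on what your file should contain:

```toml
# Hypothetical minimal config.toml — validate with `owlen config doctor`.
[mcp]
mode = "local_only"

[providers.ollama]
api_key = "your-api-key-here"   # only needed for Ollama Cloud

[privacy]
encrypt_local_data = false      # true moves the key into the encrypted vault
```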
## Ollama Cloud Authentication Errors
If you see `Auth` errors when using the hosted service:
- Run `owlen cloud setup` to register your API key (with `--api-key` for non-interactive use).
- Use `owlen cloud status` to verify Owlen can authenticate against Ollama Cloud.
- Ensure `providers.ollama.api_key` is set, or export `OLLAMA_API_KEY`/`OLLAMA_CLOUD_API_KEY` when encryption is disabled. With `privacy.encrypt_local_data = true`, the key lives in the encrypted vault and is loaded automatically.
- Confirm the key has access to the requested models.
- Avoid pasting extra quotes or whitespace into the config file — `owlen config doctor` will normalise the entry for you.
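With encryption disabled, the key entry is a single plain TOML string — no nested quotes, no trailing whitespace. A sketch (the key value is a placeholder):

```toml
[providers.ollama]
api_key = "your-api-key-here"   # one pair of quotes, no stray whitespace
```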
## Linux Ollama Sign-In Workaround (v0.12.3)
Ollama v0.12.3 on Linux ships with a broken `ollama signin` command. Until you can upgrade to ≥ 0.12.4, use one of the manual workflows below to register your key pair.
1. Manual key copy
- Locate (or generate) keys on Linux:

  ```shell
  ls -la /usr/share/ollama/.ollama/
  sudo systemctl start ollama   # start the service if the directory is empty
  ```

- Copy keys from a working Mac:

  ```shell
  # On macOS (source machine)
  cat ~/.ollama/id_ed25519.pub
  cat ~/.ollama/id_ed25519

  # On Linux (target machine)
  sudo systemctl stop ollama
  sudo mkdir -p /usr/share/ollama/.ollama
  sudo tee /usr/share/ollama/.ollama/id_ed25519.pub <<'EOF'
  <paste mac public key>
  EOF
  sudo tee /usr/share/ollama/.ollama/id_ed25519 <<'EOF'
  <paste mac private key>
  EOF
  sudo chown -R ollama:ollama /usr/share/ollama/.ollama/
  sudo chmod 600 /usr/share/ollama/.ollama/id_ed25519
  sudo chmod 644 /usr/share/ollama/.ollama/id_ed25519.pub
  sudo systemctl start ollama
  ```
2. Manual web registration
- Read the Linux public key:

  ```shell
  sudo cat /usr/share/ollama/.ollama/id_ed25519.pub
  ```

- Open https://ollama.com/settings/keys and paste the public key.
After either method, confirm access:

```shell
ollama list
```
### Troubleshooting
- Permissions: `sudo chown -R ollama:ollama /usr/share/ollama/.ollama/`, then re-apply `chmod` (`600` private, `644` public).
- Service status: `sudo systemctl status ollama` and `sudo journalctl -u ollama -f`.
- Alternate paths: Some distros run Ollama as a user process (`~/.ollama`). Copy the keys into that directory if `/usr/share/ollama/.ollama` is unused.
This workaround mirrors what `ollama signin` should do — register the key pair with Ollama Cloud — without waiting for the patched release. Once you upgrade to v0.12.4 or newer, the interactive sign-in command works again.
## Performance Tuning
If you are experiencing performance issues, you can try the following:
- Reduce context size: A smaller context size will result in faster responses from the LLM.
- Use a less resource-intensive model: Some models are faster but less capable than others.
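For local models, one way to reduce context size is to derive a smaller-context variant via an Ollama Modelfile using the standard `num_ctx` parameter. The base model, variant name, and value below are illustrative, not Owlen defaults:

```
# Hypothetical Modelfile: a smaller-context variant of llama3:8b
FROM llama3:8b
PARAMETER num_ctx 2048
```

Build it with `ollama create llama3-small -f Modelfile`, then select the new model in Owlen via `:model`.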
If you are still having trouble, please open an issue on our GitHub repository.