# Troubleshooting Guide

This guide covers common issues you might encounter while using Owlen.

## Connection Failures to Ollama

If you are unable to connect to a local Ollama instance, check the following:

1. **Is Ollama running?** Make sure the Ollama service is active. You can usually check this with `ollama list`.
2. **Is the address correct?** By default, Owlen connects to `http://localhost:11434`. If your Ollama instance runs on a different address or port, configure it in your `config.toml` file.
3. **Firewall issues:** Ensure that your firewall is not blocking the connection.
4. **Health check warnings:** Owlen now performs a provider health check on startup. If it fails, the error message includes a hint (either "start owlen-mcp-llm-server" or "ensure Ollama is running"). Resolve the hint and restart.

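If Ollama listens somewhere other than the default, the override goes under the provider table in `config.toml`. A minimal sketch; the host and port below are placeholders for your own setup:

```toml
[providers.ollama]
base_url = "http://192.168.1.50:11434"  # replace with your Ollama address and port
```
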
## Model Not Found Errors

Owlen surfaces this as `InvalidInput: Model '<name>' was not found`.

1. **Local models:** Run `ollama list` to confirm the model name (e.g., `llama3:8b`). Use `ollama pull <model>` if it is missing.
2. **Ollama Cloud:** Names may differ from local installs. Double-check <https://ollama.com/models> and remove `-cloud` suffixes.
3. **Fallback:** Switch to `mode = "local_only"` temporarily in `[mcp]` if the remote server is slow to update.

Fix the name in your configuration file or choose a model from the UI (`:model`).

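The fallback in step 3 is a one-line change in `config.toml` (a sketch; note your previous `mode` value so you can restore it later):

```toml
[mcp]
mode = "local_only"  # switch back once the remote server lists the model
```
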
## Local Models Missing After Cloud Setup

Owlen now queries both the local daemon and Ollama Cloud and shows them side by side in the picker. If you only see the cloud section (or a red `Local unavailable` banner):

1. **Confirm the daemon is reachable.** Run `ollama list` locally. If the command times out, restart the service (`ollama serve` or your systemd unit).
2. **Refresh the picker.** In the TUI, run `:models --local` to focus the local section. The footer explains if Owlen skipped the source because it was unreachable.
3. **Inspect the status line.** When the quick health probe fails, Owlen adds a `Local unavailable` / `Cloud unavailable` message instead of leaving the list blank. Use that hint to decide whether to restart Ollama or re-run `owlen cloud setup`.
4. **Keep the base URL local.** The cloud setup command no longer overrides `providers.ollama.base_url` unless `--force-cloud-base-url` is passed. If you changed it manually, edit `config.toml` or run `owlen config doctor` to restore the default `http://localhost:11434` value.

Once the daemon responds again, the picker automatically merges the updated local list with the cloud catalogue.

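The reachability check in step 1 can also be scripted without the Ollama CLI. A minimal sketch that probes the daemon over HTTP; it assumes the default base URL and uses Ollama's `/api/tags` model-list endpoint:

```shell
# Probe the local Ollama daemon (sketch; pass a different URL if you changed
# providers.ollama.base_url). /api/tags lists locally installed models.
probe_ollama() {
  if curl -fsS --max-time 2 "${1:-http://localhost:11434}/api/tags" >/dev/null 2>&1; then
    echo "local daemon reachable"
  else
    echo "local daemon unreachable: try 'ollama serve' or check your systemd unit"
  fi
}

probe_ollama
```

Either message maps directly onto the picker's `Local unavailable` banner, so this is a quick way to tell a daemon problem from a rendering one.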
## Terminal Compatibility Issues

Owlen is built with `ratatui`, which supports most modern terminals. However, if you are experiencing rendering issues, please check the following:

- Your terminal supports Unicode.
- You are using a font that includes the characters being displayed.
- Try a different terminal emulator to see if the issue persists.

## Configuration File Problems

If Owlen is not behaving as you expect, there might be an issue with your configuration file.

- **Location:** Run `owlen config path` to print the exact location (Linux, macOS, or Windows). Owlen now follows platform defaults instead of hard-coding `~/.config`.
- **Syntax:** The configuration file is in TOML format. Make sure the syntax is correct.
- **Values:** Check that the values for your models, providers, and other settings are correct.
- **Automation:** Run `owlen config doctor` to migrate legacy settings (`mode = "legacy"`, missing providers) and validate the file before launching the TUI.

## Ollama Cloud Authentication Errors

If you see `Auth` errors when using the hosted service:

1. Run `owlen cloud setup` to register your API key (with `--api-key` for non-interactive use).
2. Use `owlen cloud status` to verify Owlen can authenticate against [Ollama Cloud](https://docs.ollama.com/cloud).
3. Ensure `providers.ollama.api_key` is set **or** export `OLLAMA_API_KEY` / `OLLAMA_CLOUD_API_KEY` when encryption is disabled. With `privacy.encrypt_local_data = true`, the key lives in the encrypted vault and is loaded automatically.
4. Confirm the key has access to the requested models.
5. Avoid pasting extra quotes or whitespace into the config file; `owlen config doctor` will normalise the entry for you.

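For step 3, with encryption disabled the key can live directly in `config.toml`; the value below is a placeholder:

```toml
[providers.ollama]
api_key = "your-api-key"  # placeholder; alternatively export OLLAMA_API_KEY
```
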
### Linux Ollama Sign-In Workaround (v0.12.3)

Ollama v0.12.3 on Linux ships with a broken `ollama signin` command. Until you can upgrade to ≥0.12.4, use one of the manual workflows below to register your key pair.

#### 1. Manual key copy

1. **Locate (or generate) keys on Linux**

```bash
ls -la /usr/share/ollama/.ollama/
sudo systemctl start ollama  # start the service if the directory is empty
```

2. **Copy keys from a working Mac**

```bash
# On macOS (source machine)
cat ~/.ollama/id_ed25519.pub
cat ~/.ollama/id_ed25519
```

```bash
# On Linux (target machine)
sudo systemctl stop ollama
sudo mkdir -p /usr/share/ollama/.ollama
sudo tee /usr/share/ollama/.ollama/id_ed25519.pub <<'EOF'
<paste mac public key>
EOF
sudo tee /usr/share/ollama/.ollama/id_ed25519 <<'EOF'
<paste mac private key>
EOF
sudo chown -R ollama:ollama /usr/share/ollama/.ollama/
sudo chmod 600 /usr/share/ollama/.ollama/id_ed25519
sudo chmod 644 /usr/share/ollama/.ollama/id_ed25519.pub
sudo systemctl start ollama
```

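After the copy, a quick sketch to confirm the ownership and modes came out right (`stat -c` is GNU coreutils syntax, so this applies to Linux, not macOS):

```shell
# Succeed only when the private key is mode 600 and the public key is mode 644.
check_key_perms() {
  dir="$1"
  [ "$(stat -c '%a' "$dir/id_ed25519")" = "600" ] &&
  [ "$(stat -c '%a' "$dir/id_ed25519.pub")" = "644" ]
}

if check_key_perms /usr/share/ollama/.ollama; then
  echo "key permissions OK"
else
  echo "re-run the chown/chmod commands above"
fi
```
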
#### 2. Manual web registration

1. Read the Linux public key:

```bash
sudo cat /usr/share/ollama/.ollama/id_ed25519.pub
```

2. Open <https://ollama.com/settings/keys> and paste the public key.

After either method, confirm access:

```bash
ollama list
```

#### Troubleshooting

- Permissions: `sudo chown -R ollama:ollama /usr/share/ollama/.ollama/`, then re-apply `chmod` (`600` private, `644` public).
- Service status: `sudo systemctl status ollama` and `sudo journalctl -u ollama -f`.
- Alternate paths: some distros run Ollama as a user process (`~/.ollama`). Copy the keys into that directory if `/usr/share/ollama/.ollama` is unused.

This workaround mirrors what `ollama signin` should do (register the key pair with Ollama Cloud) without waiting for the patched release. Once you upgrade to v0.12.4 or newer, the interactive sign-in command works again.

## Performance Tuning

If you are experiencing performance issues, you can try the following:

- **Reduce the context size:** A smaller context size results in faster responses from the LLM.
- **Use a less resource-intensive model:** Some models are faster but less capable than others.

If you are still having trouble, please [open an issue](https://github.com/Owlibou/owlen/issues) on our GitHub repository.