Troubleshooting Guide

This guide covers common issues you may encounter while using Owlen and how to resolve them.

Connection Failures to Ollama

If you are unable to connect to a local Ollama instance, here are a few things to check:

  1. Is Ollama running? Make sure the Ollama service is active. You can usually check this with ollama list.
  2. Is the address correct? By default, Owlen tries to connect to http://localhost:11434. If your Ollama instance is running on a different address or port, you will need to configure it in your config.toml file.
  3. Firewall issues: Ensure that your firewall is not blocking the connection.
  4. Health check warnings: Owlen now performs a provider health check on startup. If it fails, the error message will include a hint (either "start owlen-mcp-llm-server" or "ensure Ollama is running"). Resolve the hint and restart.
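
If Owlen should talk to a daemon on another host or port, point it there in config.toml. A minimal sketch, reusing the providers.ollama.base_url key described later in this guide (the host shown is a placeholder):

```toml
# config.toml — non-default Ollama endpoint.
# 192.168.1.50 is a placeholder host; substitute your own.
[providers.ollama]
base_url = "http://192.168.1.50:11434"
```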

Model Not Found Errors

Owlen surfaces this as InvalidInput: Model '<name>' was not found.

  1. Local models: Run ollama list to confirm the model name (e.g., llama3:8b). Use ollama pull <model> if it is missing.
  2. Ollama Cloud: Names may differ from local installs. Double-check https://ollama.com/models and remove -cloud suffixes.
  3. Fallback: Switch to mode = "local_only" temporarily in [mcp] if the remote server is slow to update.

Fix the name in your configuration file or choose a model from the UI (:model).
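
The local-only fallback from step 3 is a one-line change. A sketch of the relevant config.toml fragment, using the [mcp] table named above:

```toml
# Temporary fallback while the remote server's model list catches up;
# revert to your previous mode once it lists the model.
[mcp]
mode = "local_only"
```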

Local Models Missing After Cloud Setup

Owlen now queries both the local daemon and Ollama Cloud and shows them side-by-side in the picker. If you only see the cloud section (or a red Local unavailable banner):

  1. Confirm the daemon is reachable. Run ollama list locally. If the command times out, restart the service (ollama serve or your systemd unit).
  2. Refresh the picker. In the TUI press :models --local to focus the local section. The footer will explain if Owlen skipped the source because it was unreachable.
  3. Inspect the status line. When the quick health probe fails, Owlen adds a Local unavailable / Cloud unavailable message instead of leaving the list blank. Use that hint to decide whether to restart Ollama or re-run owlen cloud setup.
  4. Keep the base URL local. The cloud setup command no longer overrides providers.ollama.base_url unless --force-cloud-base-url is passed. If you changed it manually, edit config.toml or run owlen config doctor to restore the default http://localhost:11434 value.

Owlen runs a background health worker every 30 seconds; once the daemon responds again, the picker automatically merges the refreshed local list with the cloud catalogue, so no restart is needed.
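
You can reproduce the quick health probe by hand. A minimal sketch, assuming the daemon listens on the default http://localhost:11434 and using /api/tags, Ollama's model-listing endpoint:

```shell
# Manual version of the health probe: ask the daemon for its model list.
# Exits cleanly either way and reports whether the local side is down.
state=$(curl -sf --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1 \
  && echo reachable || echo unreachable)
echo "local daemon: $state"
```

If this prints unreachable, restart the daemon (ollama serve or your systemd unit) before blaming the picker.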

Terminal Compatibility Issues

Owlen is built with ratatui, which supports most modern terminals. However, if you are experiencing rendering issues, please check the following:

  • Your terminal supports Unicode.
  • You are using a font that includes the characters being displayed.
  • Try a different terminal emulator to see if the issue persists.
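
The Unicode check above can be scripted. A small sketch that inspects the active locale, a common cause of corrupted border and gauge glyphs:

```shell
# ratatui draws borders and gauges with Unicode glyphs; a non-UTF-8
# locale is a frequent cause of mojibake in the TUI.
msg=$(if locale charmap 2>/dev/null | grep -qi 'utf-*8'; then
        echo "UTF-8 locale detected"
      else
        echo "non-UTF-8 locale; try: export LANG=en_US.UTF-8"
      fi)
echo "$msg"
```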

Configuration File Problems

If Owlen is not behaving as you expect, there might be an issue with your configuration file.

  • Location: Run owlen config path to print the exact location (Linux, macOS, or Windows). Owlen now follows platform defaults instead of hard-coding ~/.config.
  • Syntax: The configuration file is in TOML format. Make sure the syntax is correct.
  • Values: Check that the values for your models, providers, and other settings are correct.
  • Automation: Run owlen config doctor to migrate legacy settings (mode = "legacy", missing providers) and validate the file before launching the TUI.
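
For reference, a minimal config.toml sketch assembled only from keys mentioned in this guide; anything beyond these is version-dependent, so let owlen config doctor validate the result:

```toml
# Minimal illustrative config.toml using only keys named in this guide.
[mcp]
mode = "local_only"            # or your preferred mode

[providers.ollama]
base_url = "http://localhost:11434"
```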

Ollama Cloud Authentication Errors

If you see Auth errors when using the hosted service:

  1. Run owlen cloud setup to register your API key (with --api-key for non-interactive use).
  2. Use owlen cloud status to verify Owlen can authenticate against Ollama Cloud.
  3. Ensure providers.ollama.api_key is set or export OLLAMA_API_KEY (legacy: OLLAMA_CLOUD_API_KEY / OWLEN_OLLAMA_CLOUD_API_KEY) when encryption is disabled. With privacy.encrypt_local_data = true, the key lives in the encrypted vault and is loaded automatically.
  4. Confirm the key has access to the requested models.
  5. Avoid pasting extra quotes or whitespace into the config file—owlen config doctor will normalise the entry for you.
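
The stray-quote problem in step 5 is easy to hit when copying a key from a password manager. A sketch of trimming the value yourself before exporting it (owlen config doctor does the equivalent for the config file entry; the key shown is a placeholder):

```shell
# Strip surrounding whitespace and quotes from a pasted API key.
# "sk-example-key" is a placeholder, not a real credential.
raw='  "sk-example-key"  '
OLLAMA_API_KEY=$(printf '%s' "$raw" | sed -e 's/^[[:space:]"]*//' -e 's/[[:space:]"]*$//')
export OLLAMA_API_KEY
echo "$OLLAMA_API_KEY"
```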

Linux Ollama Sign-In Workaround (v0.12.3)

Ollama v0.12.3 on Linux ships with a broken ollama signin command. Until you can upgrade to ≥0.12.4, use one of the manual workflows below to register your key pair.

1. Manual key copy

  1. Locate (or generate) keys on Linux
    ls -la /usr/share/ollama/.ollama/
    sudo systemctl start ollama   # start the service if the directory is empty
    
  2. Copy keys from a working Mac
    # On macOS (source machine)
    cat ~/.ollama/id_ed25519.pub
    cat ~/.ollama/id_ed25519
    
    # On Linux (target machine)
    sudo systemctl stop ollama
    sudo mkdir -p /usr/share/ollama/.ollama
    sudo tee /usr/share/ollama/.ollama/id_ed25519.pub <<'EOF'
    <paste mac public key>
    EOF
    sudo tee /usr/share/ollama/.ollama/id_ed25519 <<'EOF'
    <paste mac private key>
    EOF
    sudo chown -R ollama:ollama /usr/share/ollama/.ollama/
    sudo chmod 600 /usr/share/ollama/.ollama/id_ed25519
    sudo chmod 644 /usr/share/ollama/.ollama/id_ed25519.pub
    sudo systemctl start ollama
    

2. Manual web registration

  1. Read the Linux public key:
    sudo cat /usr/share/ollama/.ollama/id_ed25519.pub
    
  2. Open https://ollama.com/settings/keys and paste the public key.

After either method, confirm access:

ollama list

Troubleshooting

  • Permissions: sudo chown -R ollama:ollama /usr/share/ollama/.ollama/ then re-apply chmod (600 private, 644 public).
  • Service status: sudo systemctl status ollama and sudo journalctl -u ollama -f.
  • Alternate paths: Some distros run Ollama as a user process (~/.ollama). Copy the keys into that directory if /usr/share/ollama/.ollama is unused.

This workaround mirrors what ollama signin should do—register the key pair with Ollama Cloud—without waiting for the patched release. Once you upgrade to v0.12.4 or newer, the interactive sign-in command works again.

Performance Tuning

If you are experiencing performance issues, you can try the following:

  • Reduce context size: A smaller context size will result in faster responses from the LLM.
  • Use a less resource-intensive model: Some models are faster but less capable than others.
  • Watch the header gauges: The cockpit header now shows live context usage and cloud quota bands—if either bar turns amber or red, trim the prompt or switch providers before retrying.
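
As an illustration of the first two tips only: the [session] table and max_context_tokens key below are hypothetical stand-ins, so check owlen config doctor or your version's docs for the real key names:

```toml
# Hypothetical keys shown for illustration; verify against your version.
[session]
max_context_tokens = 4096      # smaller context -> faster responses

[providers.ollama]
default_model = "llama3:8b"    # a lighter model than the default
```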

If you are still having trouble, please open an issue on our GitHub repository.