# Troubleshooting Guide
This guide covers common issues you may encounter while using Owlen.
## Connection Failures to Ollama
If you are unable to connect to a local Ollama instance, here are a few things to check:
- Is Ollama running? Make sure the Ollama service is active. You can usually check this with `ollama list`.
- Is the address correct? By default, Owlen tries to connect to `http://localhost:11434`. If your Ollama instance is running on a different address or port, you will need to configure it in your `config.toml` file.
- Firewall issues: Ensure that your firewall is not blocking the connection.
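To quickly verify that something is answering at the default address, a simple reachability check like the following can help. The URL is Ollama's default; adjust it if you have changed the address:

```shell
# Probe the Ollama HTTP endpoint (Ollama answers a plain GET on its root path).
check_ollama() {
  # -s: silent, -f: fail on HTTP errors, short timeout so a dead host fails fast
  curl -sf --max-time 2 "${1:-http://localhost:11434}/" >/dev/null
}

if check_ollama; then
  echo "Ollama is reachable"
else
  echo "Ollama is not reachable - is the service running?"
fi
```

If the check fails but `ollama list` works locally, the problem is almost certainly the address or port configured in Owlen rather than Ollama itself.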
## Model Not Found Errors
If you get a "model not found" error, it means that the model you are trying to use is not available. For local providers like Ollama, you can use `ollama list` to see the models you have downloaded. Make sure the model name in your Owlen configuration matches one of the available models.
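With the Ollama CLI, you can list what is installed and, if needed, download a missing model. The model name is a placeholder here; substitute the one from your configuration:

```shell
# Show locally downloaded models; the names must match your Owlen config.
if command -v ollama >/dev/null 2>&1; then
  ollama list
  # ollama pull <model-name>   # uncomment and substitute to fetch a missing model
else
  echo "ollama CLI not found in PATH"
fi
```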
## Terminal Compatibility Issues
Owlen is built with `ratatui`, which supports most modern terminals. However, if you are experiencing rendering issues, please check the following:
- Your terminal supports Unicode.
- You are using a font that includes the characters being displayed.
- Try a different terminal emulator to see if the issue persists.
## Configuration File Problems
If Owlen is not behaving as you expect, there might be an issue with your configuration file.
- Location: The configuration file is typically located at `~/.config/owlen/config.toml`.
- Syntax: The configuration file is in TOML format. Make sure the syntax is correct.
- Values: Check that the values for your models, providers, and other settings are correct.
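When in doubt, comparing your file against a minimal known-good example can help. The exact keys depend on your Owlen version, so treat the names below as illustrative rather than authoritative:

```toml
# ~/.config/owlen/config.toml -- illustrative structure only;
# consult the project's configuration documentation for the actual schema.
[provider]
name = "ollama"
url  = "http://localhost:11434"

[model]
name = "llama3"   # must match a model shown by `ollama list`
```

A common failure mode is a typo that TOML still parses (for example, a misspelled key), which Owlen may silently ignore, so check spelling as well as syntax.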
## Performance Tuning
If you are experiencing performance issues, you can try the following:
- Reduce context size: A smaller context size will result in faster responses from the LLM.
- Use a less resource-intensive model: Some models are faster but less capable than others.
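Lowering the context window is usually the biggest single lever. Ollama calls this option `num_ctx`; the snippet below assumes Owlen forwards such model options to the provider, which may not match your version, so verify the key name against your configuration documentation:

```toml
# Illustrative only: assumes Owlen passes model options through to Ollama.
[model.options]
num_ctx = 2048   # smaller context -> lower memory use and faster responses
```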
If you are still having trouble, please open an issue on our GitHub repository.