Add comprehensive documentation and examples for Owlen architecture and usage
- Include a detailed architecture overview in `docs/architecture.md`.
- Add `docs/configuration.md`, detailing configuration file structure and settings.
- Provide a step-by-step provider implementation guide in `docs/provider-implementation.md`.
- Add a frequently asked questions (FAQ) document in `docs/faq.md`.
- Create `docs/migration-guide.md` for future breaking changes and version upgrades.
- Introduce new examples in `examples/` showcasing basic chat, custom providers, and theming.
- Add a changelog (`CHANGELOG.md`) for tracking significant changes.
- Provide contribution guidelines (`CONTRIBUTING.md`) and a Code of Conduct (`CODE_OF_CONDUCT.md`).
docs/troubleshooting.md (new file)
@@ -0,0 +1,40 @@
# Troubleshooting Guide
This guide is intended to help you with common issues you might encounter while using Owlen.
## Connection Failures to Ollama
If you are unable to connect to a local Ollama instance, here are a few things to check:
1. **Is Ollama running?** Make sure the Ollama service is active. You can usually check this with `ollama list`.
2. **Is the address correct?** By default, Owlen tries to connect to `http://localhost:11434`. If your Ollama instance is running on a different address or port, you will need to configure it in your `config.toml` file (see the sketch after this list).
3. **Firewall issues:** Ensure that your firewall is not blocking the connection.
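
If you need to point Owlen at a non-default address, the setting lives in `config.toml`. The section and key names below are a sketch only and may differ in your Owlen version; `docs/configuration.md` has the authoritative layout.

```toml
# Sketch only: section and key names are assumptions, not the definitive schema.
[providers.ollama]
# Change this if Ollama listens on a different host or port.
base_url = "http://localhost:11434"
```

You may need to restart Owlen after editing the file for the new address to take effect.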
## Model Not Found Errors
If you get a "model not found" error, it means that the model you are trying to use is not available. For local providers like Ollama, you can use `ollama list` to see the models you have downloaded. Make sure the model name in your Owlen configuration matches one of the available models.
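
As a rough sketch (the key layout here is an assumption; see `docs/configuration.md` for the real schema), the model name in your configuration must match an entry printed by `ollama list` exactly, including any tag:

```toml
# Illustrative only: key names may differ in your Owlen version.
[providers.ollama]
model = "llama3:8b"  # must match a name shown by `ollama list`
```

If the model is not present locally, `ollama pull <model>` (for example `ollama pull llama3:8b`) downloads it.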
## Terminal Compatibility Issues
Owlen is built with `ratatui`, which supports most modern terminals. However, if you are experiencing rendering issues, please check the following:
- Make sure your terminal supports Unicode.
- Use a font that includes the characters being displayed.
- Try a different terminal emulator to see if the issue persists.
## Configuration File Problems
If Owlen is not behaving as you expect, there might be an issue with your configuration file.
- **Location:** The configuration file is typically located at `~/.config/owlen/config.toml`.
- **Syntax:** The configuration file is in TOML format. Make sure the syntax is correct (a minimal example follows this list).
- **Values:** Check that the values for your models, providers, and other settings are correct.
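
A minimal `config.toml` might look like the following. This is a hedged sketch rather than the canonical schema; the actual keys are documented in `docs/configuration.md`.

```toml
# Minimal illustrative config (key names are assumptions).
default_provider = "ollama"

[providers.ollama]
base_url = "http://localhost:11434"
model = "llama3:8b"
```

If the file fails to parse, the TOML error message typically names the offending line, which is a good place to start.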
## Performance Tuning
If you are experiencing performance issues, you can try the following:
- **Reduce context size:** A smaller context size will result in faster responses from the LLM (see the sketch after this list).
- **Use a less resource-intensive model:** Some models are faster but less capable than others.
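
As an illustration, both of these are configuration-level choices. The keys below are hypothetical and may not exist under these exact names in your Owlen version; treat them as a sketch and check `docs/configuration.md`.

```toml
# Hypothetical keys for illustration; consult docs/configuration.md for actual names.
[providers.ollama]
model = "phi3:mini"   # a smaller, faster model
context_size = 2048   # fewer tokens of context generally means faster responses
```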
If you are still having trouble, please [open an issue](https://github.com/Owlibou/owlen/issues) on our GitHub repository.