feat(config): separate Ollama into local/cloud providers, add OpenAI & Anthropic defaults, bump schema version to 1.6.0
@@ -113,45 +113,67 @@ These settings control the behavior of the text input area.

## Provider Settings (`[providers]`)

This section contains a table for each provider you want to configure. Owlen now ships with four entries pre-populated—`ollama_local`, `ollama_cloud`, `openai`, and `anthropic`. Switch between them by updating `general.default_provider`.

```toml
[providers.ollama_local]
enabled = true
provider_type = "ollama"
base_url = "http://localhost:11434"
# api_key = "..."

[providers.ollama_cloud]
enabled = false
provider_type = "ollama_cloud"
base_url = "https://ollama.com"
api_key_env = "OLLAMA_CLOUD_API_KEY"

[providers.openai]
enabled = false
provider_type = "openai"
base_url = "https://api.openai.com/v1"
api_key_env = "OPENAI_API_KEY"

[providers.anthropic]
enabled = false
provider_type = "anthropic"
base_url = "https://api.anthropic.com/v1"
api_key_env = "ANTHROPIC_API_KEY"
```

- `enabled` (boolean, default: `true`)

  Whether the provider should be considered when refreshing models or issuing requests.

- `provider_type` (string, required)

  Identifies which implementation to use. Local Ollama instances use `"ollama"`; the hosted service uses `"ollama_cloud"`. Third-party providers use their own identifiers (`"openai"`, `"anthropic"`, ...).

- `base_url` (string, optional)

  The base URL of the provider's API.

- `api_key` / `api_key_env` (string, optional)

  Authentication material. Prefer `api_key_env` to reference an environment variable so secrets remain outside of the config file.
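
For example, the hosted Ollama entry can authenticate either way; a minimal sketch (the inline key value is a placeholder):

```toml
[providers.ollama_cloud]
provider_type = "ollama_cloud"
base_url = "https://ollama.com"
# Preferred: resolve the secret from the environment when Owlen starts.
api_key_env = "OLLAMA_CLOUD_API_KEY"
# Alternative: inline key; keep such a config file out of version control.
# api_key = "..."
```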

- `extra` (table, optional)

  Any additional, provider-specific parameters can be added here.
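
Because these keys are passed through to the backend, the accepted names depend on the provider. A hypothetical sketch, assuming the local daemon honours Ollama's `num_ctx` and `keep_alive` options (verify against your provider before relying on them):

```toml
[providers.ollama_local.extra]
num_ctx = 8192      # context window requested from the model
keep_alive = "5m"   # how long the daemon keeps the model loaded
```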

### Using Ollama Cloud

Owlen now separates the local daemon and the hosted API into two providers. Enable `ollama_cloud` once you have credentials, while keeping `ollama_local` available for on-device workloads. A minimal configuration looks like this:

```toml
[general]
default_provider = "ollama_local"

[providers.ollama_local]
enabled = true
base_url = "http://localhost:11434"

[providers.ollama_cloud]
enabled = true
base_url = "https://ollama.com"
api_key_env = "OLLAMA_CLOUD_API_KEY"
```

Requests target the same `/api/chat` endpoint documented by Ollama and automatically include the API key using a `Bearer` authorization header. If you prefer not to store the key in the config file, either rely on `api_key_env` or export the environment variable manually. Owlen normalises the base URL automatically—it enforces HTTPS, trims trailing slashes, and accepts both `https://ollama.com` and `https://api.ollama.com` without rewriting the host.

> **Tip:** If the official `ollama signin` flow fails on Linux v0.12.3, follow the [Linux Ollama sign-in workaround](#linux-ollama-sign-in-workaround-v0123) in the troubleshooting guide to copy keys from a working machine or register them manually.

@@ -166,4 +188,4 @@

```
owlen cloud models # List the hosted models your account can access
owlen cloud logout # Forget the stored API key
```

When `privacy.encrypt_local_data = true`, the API key is written to Owlen's encrypted credential vault instead of being persisted in plaintext. Subsequent invocations automatically load the key into the runtime environment so that the config file can remain redacted. If encryption is disabled, the key is stored under `[providers.ollama_cloud].api_key` as before.
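
As a sketch, assuming the `[privacy]` table described elsewhere in this guide, the two storage modes look like this:

```toml
[privacy]
# true: the key lives in the encrypted vault; the config file stays redacted.
encrypt_local_data = true

# With encryption disabled, the key would instead be persisted in plaintext:
# [providers.ollama_cloud]
# api_key = "..."
```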

@@ -82,17 +82,19 @@

Ensure your provider configuration is correct. For Ollama:

```toml
[general]
default_provider = "ollama_local"
default_model = "llama3.2:latest" # or your preferred model

[providers.ollama_local]
enabled = true
provider_type = "ollama"
base_url = "http://localhost:11434"

[providers.ollama_cloud]
enabled = true # set to false if you do not use the hosted API
provider_type = "ollama_cloud"
base_url = "https://ollama.com"
api_key_env = "OLLAMA_CLOUD_API_KEY"
```

#### Step 3: Understanding MCP Server Configuration

@@ -156,7 +158,7 @@

After updating your config:

**Solution**:

- Verify model availability on https://ollama.com/models
- Remove the `-cloud` suffix from model names when using cloud provider
- Ensure `api_key`/`api_key_env` is set in `[providers.ollama_cloud]` config

### 0.1.9 – Explicit Ollama Modes & Cloud Endpoint Storage

@@ -127,7 +127,7 @@

Create or edit `~/.config/owlen/config.toml`:

```toml
[general]
default_provider = "ollama_local"
default_model = "llama3.2:latest"

[modes.chat]
```