feat(config): separate Ollama into local/cloud providers, add OpenAI & Anthropic defaults, bump schema version to 1.6.0
@@ -82,17 +82,19 @@ Ensure your provider configuration is correct. For Ollama:
 ```toml
 [general]
-default_provider = "ollama"
+default_provider = "ollama_local"
 default_model = "llama3.2:latest" # or your preferred model
 
-[providers.ollama]
+[providers.ollama_local]
 enabled = true
 provider_type = "ollama"
 base_url = "http://localhost:11434"
 
-[providers.ollama-cloud]
-provider_type = "ollama-cloud"
+[providers.ollama_cloud]
+enabled = true # set to false if you do not use the hosted API
+provider_type = "ollama_cloud"
 base_url = "https://ollama.com"
-api_key = "$OLLAMA_API_KEY" # Optional: for Ollama Cloud
+api_key_env = "OLLAMA_CLOUD_API_KEY"
 ```
 
 #### Step 3: Understanding MCP Server Configuration
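For reference, the provider block as it reads after this change, assembled from the added lines in the hunk above (a sketch: `default_model` and the `enabled` flags are whatever suits your setup):

```toml
[general]
default_provider = "ollama_local"
default_model = "llama3.2:latest" # or your preferred model

[providers.ollama_local]
enabled = true
provider_type = "ollama"
base_url = "http://localhost:11434"

[providers.ollama_cloud]
enabled = true # set to false if you do not use the hosted API
provider_type = "ollama_cloud"
base_url = "https://ollama.com"
api_key_env = "OLLAMA_CLOUD_API_KEY" # name of the env var holding the key
```

Note the shift from `api_key` to `api_key_env`: the key itself presumably now lives in the environment rather than in the file, e.g. `export OLLAMA_CLOUD_API_KEY=...` in your shell profile.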
@@ -156,7 +158,7 @@ After updating your config:
 **Solution**:
 - Verify model availability on https://ollama.com/models
 - Remove the `-cloud` suffix from model names when using cloud provider
-- Ensure `api_key` is set in `[providers.ollama-cloud]` config
+- Ensure `api_key`/`api_key_env` is set in `[providers.ollama_cloud]` config
 
 ### 0.1.9 – Explicit Ollama Modes & Cloud Endpoint Storage
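A concrete illustration of the `-cloud` suffix rule from the troubleshooting list in the hunk above (the model name is hypothetical; only the suffix matters):

```toml
# Wrong: the suffix is redundant once requests go through ollama_cloud
# default_model = "llama3.2:latest-cloud"

# Right: plain model name; the chosen provider decides local vs. cloud
default_model = "llama3.2:latest"
```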