Set up Cargo workspace with initial crates:
- cli: main application entry point with chat streaming tests
- config: configuration management
- llm/ollama: Ollama client integration with NDJSON support

Includes .gitignore for Rust and JetBrains IDEs.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
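For context, here is a minimal sketch of the root manifest this workspace layout implies. The member paths below (crates/cli, crates/config, crates/llm/ollama) are assumptions inferred from the crate names in the commit message, not confirmed by the commit itself.

# Hypothetical root Cargo.toml; member paths are assumed from the
# crate names listed in the commit message.
[workspace]
resolver = "2"
members = [
    "crates/cli",
    "crates/config",
    "crates/llm/ollama",
]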
13 lines · 500 B · Rust
use llm_ollama::{OllamaClient, OllamaOptions};

// Spinning up a tiny local server to stub NDJSON is overkill for M0, and
// mocking reqwest to exercise the line parser indirectly is complex.
// Instead, smoke-test that the client types compile and construct, and
// leave end-to-end streaming coverage to the cli tests.

#[tokio::test]
async fn client_compiles_smoke() {
    let _ = OllamaClient::new("http://localhost:11434");
    let _ = OllamaClient::with_cloud();
    let _ = OllamaOptions { model: "qwen2.5".into(), stream: true };
}
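If direct coverage of the NDJSON framing is wanted later, a self-contained unit test along these lines could exercise line-by-line parsing without a server or a reqwest mock. This is a sketch under assumptions: ChatChunk is a hypothetical local stand-in for whatever response type llm_ollama actually deserializes, and serde (with the derive feature) and serde_json are assumed as dev-dependencies.

use serde::Deserialize;

// Hypothetical stand-in for the crate's streaming response type;
// defined locally so the test is self-contained. Not the real
// llm_ollama API.
#[derive(Debug, Deserialize)]
struct ChatChunk {
    model: String,
    done: bool,
}

#[test]
fn parses_ndjson_lines() {
    // NDJSON framing: one JSON object per newline-delimited line.
    let body = "{\"model\":\"qwen2.5\",\"done\":false}\n{\"model\":\"qwen2.5\",\"done\":true}\n";
    let chunks: Vec<ChatChunk> = body
        .lines()
        .filter(|line| !line.trim().is_empty())
        .map(|line| serde_json::from_str(line).expect("each line is valid JSON"))
        .collect();
    assert_eq!(chunks.len(), 2);
    assert_eq!(chunks[0].model, "qwen2.5");
    assert!(chunks.last().unwrap().done);
}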