feat(workspace): initialize Rust workspace structure for v2

Set up Cargo workspace with initial crates:
- cli: main application entry point with chat streaming tests
- config: configuration management
- llm/ollama: Ollama client integration with NDJSON support

Includes .gitignore for Rust and JetBrains IDEs.
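
For reference, a minimal sketch of the workspace manifest implied by the crate list above; the member paths are assumptions based on the crate names and may not match the actual layout.

```toml
# Sketch of the workspace manifest; member paths are assumed from the
# crate names listed in this commit, not confirmed against the tree.
[workspace]
resolver = "2"
members = [
    "cli",
    "config",
    "llm/ollama",
]
```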

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-01 16:30:09 +01:00
parent 491fd049b0
commit 2a651ebd7b
15 changed files with 513 additions and 0 deletions


@@ -0,0 +1,12 @@
use llm_ollama::{OllamaClient, OllamaOptions};

// Stubbing NDJSON by spinning up a tiny local server is overkill for M0,
// and testing the line parser indirectly by mocking reqwest is complex.
// Instead, we smoke-test that the client types construct and leave
// end-to-end coverage to the cli tests.
#[tokio::test]
async fn client_compiles_smoke() {
    let _ = OllamaClient::new("http://localhost:11434");
    let _ = OllamaClient::with_cloud();
    let _ = OllamaOptions { model: "qwen2.5".into(), stream: true };
}
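
As a rough illustration of the NDJSON handling mentioned above, the line parsing can be exercised against pre-recorded lines instead of a live server: each line of the stream is an independent JSON object that can be decoded on its own. The field names (`message`, `content`, `done`) follow Ollama's documented /api/chat streaming shape, and the helper below is a hypothetical sketch rather than part of llm_ollama; it assumes serde (with the derive feature) and serde_json are available as dev-dependencies.

```rust
use serde::Deserialize;

// Field names are assumptions based on Ollama's documented /api/chat
// stream shape; unknown fields are ignored by serde by default.
#[derive(Deserialize)]
struct ChatChunk {
    message: Option<Message>,
    done: bool,
}

#[derive(Deserialize)]
struct Message {
    content: String,
}

// Hypothetical helper (not part of llm_ollama): concatenate the content
// of each NDJSON chunk until the final `done` marker.
fn collect_content(ndjson: &str) -> Result<String, serde_json::Error> {
    let mut out = String::new();
    for line in ndjson.lines().filter(|l| !l.trim().is_empty()) {
        let chunk: ChatChunk = serde_json::from_str(line)?;
        if let Some(m) = chunk.message {
            out.push_str(&m.content);
        }
        if chunk.done {
            break;
        }
    }
    Ok(out)
}

#[test]
fn parses_ndjson_lines() {
    let body = r#"{"message":{"content":"Hel"},"done":false}
{"message":{"content":"lo"},"done":false}
{"done":true}"#;
    assert_eq!(collect_content(body).unwrap(), "Hello");
}
```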