Multi-LLM Provider Support:
- Add llm-core crate with LlmProvider trait abstraction
- Implement Anthropic Claude API client with streaming
- Implement OpenAI API client with streaming
- Add token counting with SimpleTokenCounter and ClaudeTokenCounter
- Add retry logic with exponential backoff and jitter

Borderless TUI Redesign:
- Rewrite theme system with terminal capability detection (Full/Unicode256/Basic)
- Add provider tabs component with keybind switching [1]/[2]/[3]
- Implement vim-modal input (Normal/Insert/Visual/Command modes)
- Redesign chat panel with timestamps and streaming indicators
- Add multi-provider status bar with cost tracking
- Add Nerd Font icons with graceful ASCII fallbacks
- Add syntax highlighting (syntect) and markdown rendering (pulldown-cmark)

Advanced Agent Features:
- Add system prompt builder with configurable components
- Enhance subagent orchestration with parallel execution
- Add git integration module for safe command detection
- Add streaming tool results via channels
- Expand tool set: AskUserQuestion, TodoWrite, LS, MultiEdit, BashOutput, KillShell
- Add WebSearch with provider abstraction

Plugin System Enhancement:
- Add full agent definition parsing from YAML frontmatter
- Add skill system with progressive disclosure
- Wire plugin hooks into HookManager

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
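The retry logic mentioned above combines exponential backoff with jitter, a standard pattern for spacing out API retries without synchronized retry storms. The commit does not show its implementation, so the sketch below is a hypothetical "full jitter" variant: the delay for attempt *n* is a uniform random value in `[0, min(cap, base * 2^n)]`. The function name `backoff_delay` and its signature are illustrative, not the crate's actual API; the caller supplies the uniform sample (e.g. from `rand`) so the function itself stays deterministic and testable.

```rust
use std::time::Duration;

/// Exponential backoff with "full jitter": the delay for a given attempt
/// is `jitter * min(cap_ms, base_ms * 2^attempt)`, where `jitter` is a
/// uniform sample in [0.0, 1.0] supplied by the caller.
fn backoff_delay(attempt: u32, base_ms: u64, cap_ms: u64, jitter: f64) -> Duration {
    // Clamp the shift so large attempt counts cannot overflow the u64.
    let exp = base_ms.saturating_mul(1u64 << attempt.min(32));
    let ceiling = exp.min(cap_ms);
    Duration::from_millis((ceiling as f64 * jitter) as u64)
}

fn main() {
    // With jitter pinned to 1.0 the delays are the deterministic upper bounds:
    // base 100ms doubling each attempt, capped at 2s.
    for attempt in 0..5 {
        println!("attempt {attempt}: {:?}", backoff_delay(attempt, 100, 2_000, 1.0));
    }
}
```

In a real client the sample would come from a thread-local RNG on each retry, and the loop would stop after a maximum attempt count or a retryable-error check.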
35 lines
1.1 KiB
Rust
use assert_cmd::Command;
use httpmock::prelude::*;
use predicates::prelude::PredicateBooleanExt;

#[tokio::test]
async fn headless_streams_ndjson() {
    let server = MockServer::start_async().await;

    let response = concat!(
        r#"{"message":{"role":"assistant","content":"Hel"}}"#, "\n",
        r#"{"message":{"role":"assistant","content":"lo"}}"#, "\n",
        r#"{"done":true}"#, "\n",
    );

    // The CLI includes tools in the request, so we match any POST to /api/chat
    // instead of matching the exact body (which includes tool definitions).
    let _m = server.mock(|when, then| {
        when.method(POST).path("/api/chat");
        then.status(200)
            .header("content-type", "application/x-ndjson")
            .body(response);
    });

    let mut cmd = Command::new(assert_cmd::cargo::cargo_bin!("owlen"));
    cmd.arg("--ollama-url").arg(server.base_url())
        .arg("--model").arg("qwen2.5")
        .arg("--print")
        .arg("hello");

    // The streamed chunks should be reassembled into "Hello" exactly once;
    // failing that, both fragments must at least appear in the output.
    cmd.assert()
        .success()
        .stdout(
            predicates::str::contains("Hello").count(1).or(
                predicates::str::contains("Hel")
                    .and(predicates::str::contains("lo")),
            ),
        );
}