feat(phase9): implement WebSocket transport and failover system

Implements Phase 9: Remoting / Cloud Hybrid Deployment with complete
WebSocket transport support and comprehensive failover mechanisms.

**WebSocket Transport (remote_client.rs):**
- Added WebSocket support to RemoteMcpClient using tokio-tungstenite
- Full bidirectional JSON-RPC communication over WebSocket (see the sketch after this list)
- Connection establishment with error handling
- Text/binary message support with proper encoding
- Connection closure detection and error reporting
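
A minimal sketch of the WebSocket round trip, assuming tokio-tungstenite 0.21 plus futures-util for the sink/stream combinators; `ws_request` and its error handling are illustrative, not the actual RemoteMcpClient method names:

```rust
// Illustrative sketch only; RemoteMcpClient wires this into the McpClient trait.
use futures_util::{SinkExt, StreamExt};
use serde_json::{json, Value};
use tokio_tungstenite::{connect_async, tungstenite::Message};

async fn ws_request(url: &str, method: &str, params: Value) -> anyhow::Result<Value> {
    // Establish the WebSocket connection (ws:// or wss://).
    let (mut stream, _response) = connect_async(url).await?;

    // Send the JSON-RPC request as a text frame.
    let request = json!({ "jsonrpc": "2.0", "id": 1, "method": method, "params": params });
    stream.send(Message::Text(request.to_string())).await?;

    // Read frames until a text or binary payload arrives; surface closure as an error.
    while let Some(frame) = stream.next().await {
        match frame? {
            Message::Text(body) => return Ok(serde_json::from_str(&body)?),
            Message::Binary(bytes) => return Ok(serde_json::from_slice(&bytes)?),
            Message::Close(_) => anyhow::bail!("server closed the connection"),
            _ => continue, // ignore ping/pong frames
        }
    }
    anyhow::bail!("connection ended without a response")
}
```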

**Failover & Redundancy (failover.rs - 323 lines):**
- ServerHealth tracking: Healthy, Degraded, Down states
- ServerEntry with priority-based selection (lower = higher priority)
- FailoverMcpClient implementing McpClient trait
- Automatic retry with exponential backoff
- Circuit breaker pattern (5 consecutive failures trigger the Down state; see the sketch after this list)
- Background health checking with configurable intervals
- Graceful failover through server priority list
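
A rough sketch of the health and priority bookkeeping described above; the type names match the commit, but the fields and helper functions are illustrative rather than the exact failover.rs definitions:

```rust
// Illustrative sketch; the real failover.rs wraps these in a FailoverMcpClient.
use std::time::Instant;

#[derive(Clone, Copy, PartialEq)]
enum ServerHealth {
    Healthy,
    Degraded,
    Down,
}

struct ServerEntry {
    url: String,
    priority: u8,               // lower value = tried first
    health: ServerHealth,
    consecutive_failures: u32,
    last_checked: Instant,      // updated by the background health checker
}

impl ServerEntry {
    /// Circuit breaker: once the configured threshold of consecutive failures
    /// is reached, the server is marked Down and skipped until a background
    /// health check succeeds again.
    fn record_failure(&mut self, threshold: u32) {
        self.consecutive_failures += 1;
        self.health = if self.consecutive_failures >= threshold {
            ServerHealth::Down
        } else {
            ServerHealth::Degraded
        };
    }

    fn record_success(&mut self) {
        self.consecutive_failures = 0;
        self.health = ServerHealth::Healthy;
    }
}

/// Failover order: servers not marked Down first, then ascending priority.
fn select_order(servers: &mut [ServerEntry]) {
    servers.sort_by_key(|s| (s.health == ServerHealth::Down, s.priority));
}
```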

**Configuration:**
- FailoverConfig with tunable parameters (defaults sketched after this list):
  - max_retries: 3 (default)
  - base_retry_delay: 100ms with exponential backoff
  - health_check_interval: 30s
  - circuit_breaker_threshold: 5 failures
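
Sketched as a struct, the defaults above look roughly like this (the actual FailoverConfig may differ in detail); exponential backoff means attempt n waits base_retry_delay * 2^n:

```rust
// Sketch of the tunables listed above; not the exact definition in failover.rs.
use std::time::Duration;

#[derive(Clone)]
pub struct FailoverConfig {
    pub max_retries: u32,
    pub base_retry_delay: Duration,
    pub health_check_interval: Duration,
    pub circuit_breaker_threshold: u32,
}

impl Default for FailoverConfig {
    fn default() -> Self {
        Self {
            max_retries: 3,
            base_retry_delay: Duration::from_millis(100),
            health_check_interval: Duration::from_secs(30),
            circuit_breaker_threshold: 5,
        }
    }
}

impl FailoverConfig {
    /// Exponential backoff: 100ms, 200ms, 400ms, ... for attempts 0, 1, 2, ...
    pub fn retry_delay(&self, attempt: u32) -> Duration {
        self.base_retry_delay * 2u32.pow(attempt)
    }
}
```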

**Testing (phase9_remoting.rs - 9 tests, all passing):**
- Priority-based server selection
- Automatic failover to backup servers
- Retry mechanism with exponential backoff (see the example after this list)
- Health status tracking and transitions
- Background health checking
- Circuit breaker behavior
- Error handling for edge cases
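
As a hypothetical example of the retry/backoff check, one test can assert the delay schedule directly; the actual tests in phase9_remoting.rs drive a FailoverMcpClient against stub servers, which is omitted here:

```rust
use std::time::Duration;

#[test]
fn retry_delays_follow_exponential_backoff() {
    // With the default 100ms base delay, attempts 0..3 wait 100ms, 200ms, 400ms.
    let base = Duration::from_millis(100);
    let delays: Vec<Duration> = (0u32..3).map(|attempt| base * 2u32.pow(attempt)).collect();
    assert_eq!(
        delays,
        vec![
            Duration::from_millis(100),
            Duration::from_millis(200),
            Duration::from_millis(400),
        ]
    );
}
```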

**Dependencies:**
- tokio-tungstenite 0.21
- tungstenite 0.21

All 9 tests pass. The Phase 9 specification is fully implemented.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
8 changed files with 1090 additions and 64 deletions

use super::{Tool, ToolResult};
use crate::Result;
use anyhow::Context;
use async_trait::async_trait;
use serde_json::{json, Value};

/// Tool that fetches the raw HTML content for a list of URLs.
///
/// Input schema expects:
/// - `urls`: array of strings (max 5 URLs)
/// - `timeout_secs`: optional integer per-request timeout (default 10)
pub struct WebScrapeTool {
    // No special dependencies; uses reqwest_011 for compatibility with the existing web_search tool.
    client: reqwest_011::Client,
}

impl Default for WebScrapeTool {
    fn default() -> Self {
        Self::new()
    }
}

impl WebScrapeTool {
    pub fn new() -> Self {
        let client = reqwest_011::Client::builder()
            .user_agent("OwlenWebScrape/0.1")
            .build()
            .expect("Failed to build reqwest client");
        Self { client }
    }
}

#[async_trait]
impl Tool for WebScrapeTool {
    fn name(&self) -> &'static str {
        "web_scrape"
    }

    fn description(&self) -> &'static str {
        "Fetch raw HTML content for a list of URLs"
    }

    fn schema(&self) -> Value {
        json!({
            "type": "object",
            "properties": {
                "urls": {
                    "type": "array",
                    "items": { "type": "string", "format": "uri" },
                    "minItems": 1,
                    "maxItems": 5,
                    "description": "List of URLs to scrape"
                },
                "timeout_secs": {
                    "type": "integer",
                    "minimum": 1,
                    "maximum": 30,
                    "default": 10,
                    "description": "Per-request timeout in seconds"
                }
            },
            "required": ["urls"],
            "additionalProperties": false
        })
    }

    fn requires_network(&self) -> bool {
        true
    }

    async fn execute(&self, args: Value) -> Result<ToolResult> {
        let urls = args
            .get("urls")
            .and_then(|v| v.as_array())
            .context("Missing 'urls' array")?;
        // Clamp the timeout to the 1..=30s range declared in the schema.
        let timeout_secs = args
            .get("timeout_secs")
            .and_then(|v| v.as_u64())
            .unwrap_or(10)
            .clamp(1, 30);

        let mut results = Vec::new();
        // Honor the schema's five-URL cap even if the caller sends more.
        for url_val in urls.iter().take(5) {
            let url = url_val.as_str().unwrap_or("");
            let resp = self
                .client
                .get(url)
                .timeout(std::time::Duration::from_secs(timeout_secs))
                .send()
                .await;
            // Record either the page body or the error per URL; one failure does not abort the batch.
            match resp {
                Ok(r) => {
                    let text = r.text().await.unwrap_or_default();
                    results.push(json!({ "url": url, "content": text }));
                }
                Err(e) => {
                    results.push(json!({ "url": url, "error": e.to_string() }));
                }
            }
        }

        Ok(ToolResult::success(json!({ "pages": results })))
    }
}