Implement comprehensive improvement roadmap (Phases 0-4)

Phase 0 - Quick fixes:
- Fix catalog entries() return type (removed extra indirection)
- Fix welcome string (mpv-mgr → empeve)
- Fix HEAD detachment on update (branch-aware fast-forward)
- Add fetch_rev with branch detection

Phase 1 - Git model ("rev means rev"):
- Add RevType enum (Commit/Tag/Branch/Default)
- Add UpdateResult enum for update outcomes
- Implement clone_with_rev for proper revision checkout
- Pinned repos (commits/tags) skip auto-update

Phase 2 - Discovery & install fidelity:
- Support init.lua and named entry points for multi-file scripts
- Better asset mapping with prefix matching for configs
- Proactive target directory creation

Phase 3 - UX and quality-of-life:
- Add --verbose flag to status command
- Add 'empeve doctor' diagnostic command
- Improve error messages with actionable hints

Phase 4 - Feature expansion:
- External TOML catalog system (extensible)
- Import --convert-local for local script management
- Lockfile support for reproducible installations

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2025-12-15 03:44:37 +01:00
parent 17b6bfb76d
commit 6f714e58fa
21 changed files with 1713 additions and 254 deletions

README.md

@@ -12,8 +12,11 @@ A plugin manager for [mpv](https://mpv.io/) scripts. Manage your mpv scripts dec
- **Multi-target support** - Manage multiple mpv configs (mpv, jellyfin-mpv-shim, celluloid, etc.)
- **Smart discovery** - Automatically finds scripts, configs, fonts, and shaders in repos
- **Symlink installation** - Scripts stay in sync with upstream, easy to update
- **Browse catalog** - Discover popular mpv scripts from a curated list
- **Browse catalog** - Discover popular mpv scripts from a curated (and extensible) list
- **Per-repo targeting** - Install specific repos to specific targets
- **Pinned versions** - Pin repos to specific commits or tags for stability
- **Lockfile support** - Create reproducible installations across machines
- **Diagnostics** - Built-in doctor command to diagnose and fix issues
## Installation
@@ -60,9 +63,15 @@ empeve update
# Add from GitHub (user/repo shorthand)
empeve add tomasklaen/uosc
# Add with specific branch/tag
# Add with specific branch (tracking - auto-updates)
empeve add tomasklaen/uosc --rev main
# Add with specific tag (pinned - won't auto-update)
empeve add tomasklaen/uosc --rev v5.0.0
# Add with specific commit (pinned - won't auto-update)
empeve add somerepo/script --rev abc123def456...
# Add only specific scripts from a multi-script repo
empeve add po5/mpv_sponsorblock --scripts sponsorblock.lua
```
@@ -79,7 +88,10 @@ empeve install --force
# Install specific repo only
empeve install uosc
# Update all repos
# Install using lockfile (exact versions)
empeve install --locked
# Update all repos (skips pinned repos)
empeve update
# Update specific repo
@@ -92,6 +104,9 @@ empeve update uosc
# Show status of all repos and targets
empeve status
# Show detailed status with per-target script info
empeve status --verbose
# List installed scripts
empeve list
@@ -123,6 +138,47 @@ empeve browse subtitles
empeve browse -i
```
### Lockfile for Reproducibility
```bash
# Create lockfile with current commit SHAs
empeve lock
# Install using exact versions from lockfile
empeve install --locked
```
The lockfile (`~/.config/empeve/empeve.lock`) records the exact commit for each repo, enabling reproducible installations across machines.
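A minimal sketch of the idea behind the lockfile, as an in-memory map from repo identifier to locked commit SHA. The on-disk format and the names `Lockfile`, `pin`, and `sha_for` are assumptions for illustration, not empeve's actual API:

```rust
use std::collections::BTreeMap;

/// Hypothetical in-memory model: repo identifier -> locked commit SHA.
/// BTreeMap keeps iteration (and thus any serialization) deterministic.
struct Lockfile {
    locked: BTreeMap<String, String>,
}

impl Lockfile {
    fn new() -> Self {
        Self { locked: BTreeMap::new() }
    }

    /// Record the commit a repo currently points at (what `empeve lock` does).
    fn pin(&mut self, repo: &str, sha: &str) {
        self.locked.insert(repo.to_string(), sha.to_string());
    }

    /// Look up the exact commit to check out (what `install --locked` needs).
    fn sha_for(&self, repo: &str) -> Option<&str> {
        self.locked.get(repo).map(String::as_str)
    }
}
```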
### Converting Local Scripts
```bash
# Convert unmanaged local scripts to git-managed repos
empeve import --convert-local
# Convert a specific script by name
empeve import --convert-local --script my-script
```
This creates a local git repository for your scripts, enabling version control and empeve management.
### Diagnostics
```bash
# Run diagnostic checks
empeve doctor
# Auto-fix issues where possible
empeve doctor --fix
```
The doctor command checks:
- Directory permissions
- Symlink support
- Repository health
- Target configuration
- Orphaned assets
### Multi-Target Support
empeve can manage multiple mpv configurations simultaneously:
@@ -177,11 +233,40 @@ scripts = ["playlistmanager.lua"] # Only this script
| Option | Description |
|--------|-------------|
| `repo` | Repository identifier (`user/repo` or full git URL) |
| `rev` | Branch, tag, or commit to checkout |
| `rev` | Branch (tracking), tag, or commit (pinned) |
| `scripts` | Only install specific scripts from the repo |
| `targets` | Only install to specific targets (default: all) |
| `rename` | Rename the script when installing |
| `disabled` | Disable without removing from config |
| `local` | Mark as local-only repo (for converted scripts) |
### Revision Types
empeve automatically detects the type of revision:
| Pattern | Type | Behavior |
|---------|------|----------|
| Branch name (e.g., `main`) | Tracking | Updates automatically |
| Tag (e.g., `v1.2.3`) | Pinned | Stays at version |
| Commit SHA (40 chars) | Pinned | Stays at commit |
| (none) | Tracking | Follows default branch |
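The detection heuristics in the table can be sketched as follows. This is an illustration built only from the table, not empeve's actual implementation (which may, for example, ask git whether a ref resolves to a tag); in particular the "`v` followed by a digit" tag rule is an assumption:

```rust
#[derive(Debug, PartialEq)]
enum RevType {
    Commit,  // pinned: full 40-char SHA
    Tag,     // pinned: release tag
    Branch,  // tracking: named branch
    Default, // tracking: default branch
}

// Classify a --rev argument using the heuristics from the table above.
fn classify(rev: Option<&str>) -> RevType {
    match rev {
        None => RevType::Default,
        // 40 hex characters: treat as a commit SHA
        Some(r) if r.len() == 40 && r.chars().all(|c| c.is_ascii_hexdigit()) => RevType::Commit,
        // "v" + digit (e.g. v5.0.0): treat as a tag (assumed heuristic)
        Some(r) if r.starts_with('v') && r[1..].starts_with(|c: char| c.is_ascii_digit()) => RevType::Tag,
        // Anything else: treat as a branch name
        Some(_) => RevType::Branch,
    }
}
```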
### External Catalogs
You can add custom script catalogs by creating TOML files in `~/.config/empeve/catalogs/`:
```toml
# ~/.config/empeve/catalogs/my-scripts.toml
[meta]
name = "My Custom Catalog"
version = "1.0.0"
[[entries]]
repo = "myuser/my-script"
name = "my-script"
description = "My awesome mpv script"
category = "utility"
```
## Commands Reference
@@ -190,12 +275,14 @@ scripts = ["playlistmanager.lua"] # Only this script
| `add <repo>` | Add a repository to config |
| `remove <repo>` | Remove a repository from config |
| `install` | Clone repos and install scripts |
| `update` | Update all repositories |
| `update` | Update all repositories (skips pinned) |
| `clean` | Remove orphaned scripts and repos |
| `status` | Show status of repos and targets |
| `list` | List installed scripts |
| `browse` | Browse popular mpv scripts |
| `import` | Import existing scripts (coming soon) |
| `import` | Import existing scripts |
| `doctor` | Diagnose and fix setup issues |
| `lock` | Create lockfile with current commits |
### Global Options
@@ -215,9 +302,14 @@ scripts = ["playlistmanager.lua"] # Only this script
```
~/.config/empeve/
├── config.toml
├── empeve.lock # Lockfile (optional)
├── catalogs/ # Custom catalogs
│ └── my-scripts.toml
└── repos/
├── tomasklaen_uosc/
├── po5_mpv_sponsorblock/
└── local/ # Converted local scripts
└── my-script/
~/.config/mpv/scripts/
├── uosc -> ~/.config/empeve/repos/tomasklaen_uosc/src/uosc/


@@ -1,153 +0,0 @@
/// Curated catalog of popular mpv scripts
/// These are well-maintained, commonly used scripts from the mpv community
#[derive(Debug, Clone)]
pub struct CatalogEntry {
pub repo: &'static str,
pub name: &'static str,
pub description: &'static str,
pub category: Category,
}
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Category {
Ui,
Playback,
Subtitles,
Media,
Utility,
}
impl Category {
pub fn as_str(&self) -> &'static str {
match self {
Category::Ui => "UI/OSC",
Category::Playback => "Playback",
Category::Subtitles => "Subtitles",
Category::Media => "Media",
Category::Utility => "Utility",
}
}
pub fn all() -> &'static [Category] {
&[
Category::Ui,
Category::Playback,
Category::Subtitles,
Category::Media,
Category::Utility,
]
}
}
/// Get all catalog entries
pub fn entries() -> &'static [CatalogEntry] {
&CATALOG
}
/// Get entries by category
pub fn entries_by_category(category: Category) -> Vec<&'static CatalogEntry> {
CATALOG.iter().filter(|e| e.category == category).collect()
}
static CATALOG: &[CatalogEntry] = &[
// === UI/OSC ===
CatalogEntry {
repo: "tomasklaen/uosc",
name: "uosc",
description: "Feature-rich minimalist proximity-based UI replacement",
category: Category::Ui,
},
CatalogEntry {
repo: "cyl0/ModernX",
name: "ModernX",
description: "Modern OSC replacement with streaming info support",
category: Category::Ui,
},
CatalogEntry {
repo: "po5/thumbfast",
name: "thumbfast",
description: "High-performance on-the-fly thumbnails (works with uosc/ModernX)",
category: Category::Ui,
},
CatalogEntry {
repo: "christoph-heinrich/mpv-quality-menu",
name: "quality-menu",
description: "Menu for streaming quality selection (YouTube, etc.)",
category: Category::Ui,
},
// === Playback ===
CatalogEntry {
repo: "mpv-player/mpv",
name: "autoload",
description: "Auto-load playlist entries from current directory",
category: Category::Playback,
},
CatalogEntry {
repo: "jonniek/mpv-playlistmanager",
name: "playlistmanager",
description: "Visual playlist manager with file browser",
category: Category::Playback,
},
CatalogEntry {
repo: "po5/trackselect",
name: "trackselect",
description: "Intelligent track selection based on preferences",
category: Category::Playback,
},
CatalogEntry {
repo: "po5/mpv_sponsorblock",
name: "sponsorblock",
description: "Skip YouTube sponsor segments automatically (requires Python 3)",
category: Category::Playback,
},
// === Subtitles ===
CatalogEntry {
repo: "davidde/mpv-autosub",
name: "autosub",
description: "Auto-download subtitles using subliminal",
category: Category::Subtitles,
},
CatalogEntry {
repo: "kelciour/mpv-scripts",
name: "sub-search",
description: "Search and download subtitles interactively",
category: Category::Subtitles,
},
// === Media ===
CatalogEntry {
repo: "ekisu/mpv-webm",
name: "webm",
description: "Create WebM/GIF clips from within mpv",
category: Category::Media,
},
CatalogEntry {
repo: "TheAMM/mpv_thumbnail_script",
name: "thumbnail-script",
description: "Show preview thumbnails on the seekbar",
category: Category::Media,
},
// === Utility ===
CatalogEntry {
repo: "occivink/mpv-scripts",
name: "crop/encode/seek-to",
description: "Collection: crop, encode, seek-to, and more",
category: Category::Utility,
},
CatalogEntry {
repo: "Eisa01/mpv-scripts",
name: "SmartHistory/UndoRedo",
description: "Collection: smart history, undo/redo, clipboard",
category: Category::Utility,
},
CatalogEntry {
repo: "https://somegit.dev/anonfunc/mpv-scripts",
name: "anonfunc-scripts",
description: "Collection of mpv utility scripts",
category: Category::Utility,
},
];

src/catalog/bundled.toml Normal file

@@ -0,0 +1,99 @@
[meta]
name = "Official empeve Catalog"
version = "1.0.0"
description = "Curated list of popular mpv scripts"
# === UI/OSC ===
[[entries]]
repo = "tomasklaen/uosc"
name = "uosc"
description = "Feature-rich minimalist proximity-based UI replacement"
category = "ui"
[[entries]]
repo = "cyl0/ModernX"
name = "ModernX"
description = "Modern OSC replacement with streaming info support"
category = "ui"
[[entries]]
repo = "po5/thumbfast"
name = "thumbfast"
description = "High-performance on-the-fly thumbnails (works with uosc/ModernX)"
category = "ui"
[[entries]]
repo = "christoph-heinrich/mpv-quality-menu"
name = "quality-menu"
description = "Menu for streaming quality selection (YouTube, etc.)"
category = "ui"
# === Playback ===
[[entries]]
repo = "mpv-player/mpv"
name = "autoload"
description = "Auto-load playlist entries from current directory"
category = "playback"
[[entries]]
repo = "jonniek/mpv-playlistmanager"
name = "playlistmanager"
description = "Visual playlist manager with file browser"
category = "playback"
[[entries]]
repo = "po5/trackselect"
name = "trackselect"
description = "Intelligent track selection based on preferences"
category = "playback"
[[entries]]
repo = "po5/mpv_sponsorblock"
name = "sponsorblock"
description = "Skip YouTube sponsor segments automatically (requires Python 3)"
category = "playback"
# === Subtitles ===
[[entries]]
repo = "davidde/mpv-autosub"
name = "autosub"
description = "Auto-download subtitles using subliminal"
category = "subtitles"
[[entries]]
repo = "kelciour/mpv-scripts"
name = "sub-search"
description = "Search and download subtitles interactively"
category = "subtitles"
# === Media ===
[[entries]]
repo = "ekisu/mpv-webm"
name = "webm"
description = "Create WebM/GIF clips from within mpv"
category = "media"
[[entries]]
repo = "TheAMM/mpv_thumbnail_script"
name = "thumbnail-script"
description = "Show preview thumbnails on the seekbar"
category = "media"
# === Utility ===
[[entries]]
repo = "occivink/mpv-scripts"
name = "crop/encode/seek-to"
description = "Collection: crop, encode, seek-to, and more"
category = "utility"
[[entries]]
repo = "Eisa01/mpv-scripts"
name = "SmartHistory/UndoRedo"
description = "Collection: smart history, undo/redo, clipboard"
category = "utility"
[[entries]]
repo = "https://somegit.dev/anonfunc/mpv-scripts"
name = "anonfunc-scripts"
description = "Collection of mpv utility scripts"
category = "utility"

src/catalog/mod.rs Normal file

@@ -0,0 +1,192 @@
//! Catalog system for curated mpv scripts
//!
//! Supports both bundled catalogs and external TOML files
use serde::{Deserialize, Serialize};
use std::path::Path;
use crate::error::Result;
/// A catalog entry representing a script repository
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct CatalogEntry {
pub repo: String,
pub name: String,
pub description: String,
pub category: String,
}
/// Metadata about a catalog file
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct CatalogMeta {
pub name: Option<String>,
pub version: Option<String>,
pub description: Option<String>,
}
/// A complete catalog file structure
#[derive(Debug, Clone, Serialize, Deserialize, Default)]
pub struct CatalogFile {
#[serde(default)]
pub meta: CatalogMeta,
#[serde(default)]
pub entries: Vec<CatalogEntry>,
}
impl CatalogFile {
/// Load a catalog from a TOML file
pub fn load(path: &Path) -> Result<Self> {
let content = std::fs::read_to_string(path)?;
let catalog: CatalogFile = toml::from_str(&content)?;
Ok(catalog)
}
/// Parse a catalog from a TOML string
pub fn from_str(content: &str) -> Result<Self> {
let catalog: CatalogFile = toml::from_str(content)?;
Ok(catalog)
}
}
/// Bundled default catalog (embedded at compile time)
const BUNDLED_CATALOG: &str = include_str!("bundled.toml");
/// Category constants for backwards compatibility
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum Category {
Ui,
Playback,
Subtitles,
Media,
Utility,
}
impl Category {
pub fn as_str(&self) -> &'static str {
match self {
Category::Ui => "UI/OSC",
Category::Playback => "Playback",
Category::Subtitles => "Subtitles",
Category::Media => "Media",
Category::Utility => "Utility",
}
}
pub fn display_name(&self) -> &'static str {
self.as_str()
}
pub fn all() -> &'static [Category] {
&[
Category::Ui,
Category::Playback,
Category::Subtitles,
Category::Media,
Category::Utility,
]
}
/// Match a category string to a Category enum (case-insensitive)
pub fn from_str(s: &str) -> Option<Self> {
match s.to_lowercase().as_str() {
"ui" | "ui/osc" => Some(Category::Ui),
"playback" => Some(Category::Playback),
"subtitles" => Some(Category::Subtitles),
"media" => Some(Category::Media),
"utility" => Some(Category::Utility),
_ => None,
}
}
}
/// Catalog manager that loads and merges catalogs from multiple sources
pub struct CatalogManager {
entries: Vec<CatalogEntry>,
}
impl CatalogManager {
/// Load catalogs from bundled and external sources
pub fn load(catalogs_dir: Option<&Path>) -> Result<Self> {
let mut entries = Vec::new();
// 1. Load bundled catalog
if let Ok(bundled) = CatalogFile::from_str(BUNDLED_CATALOG) {
entries.extend(bundled.entries);
}
// 2. Load external catalogs from catalogs directory
if let Some(dir) = catalogs_dir {
if dir.exists() {
if let Ok(read_dir) = std::fs::read_dir(dir) {
for entry in read_dir.filter_map(|e| e.ok()) {
let path = entry.path();
if path.extension().map(|e| e == "toml").unwrap_or(false) {
if let Ok(catalog) = CatalogFile::load(&path) {
// Later catalogs can override earlier ones by repo key
for new_entry in catalog.entries {
// Remove existing entry with same repo if any
entries.retain(|e| e.repo != new_entry.repo);
entries.push(new_entry);
}
}
}
}
}
}
}
Ok(Self { entries })
}
/// Load only the bundled catalog (for backwards compatibility)
pub fn bundled() -> Self {
let entries = CatalogFile::from_str(BUNDLED_CATALOG)
.map(|c| c.entries)
.unwrap_or_default();
Self { entries }
}
/// Get all catalog entries
pub fn entries(&self) -> &[CatalogEntry] {
&self.entries
}
/// Get entries filtered by category
pub fn entries_by_category(&self, category: &str) -> Vec<&CatalogEntry> {
self.entries
.iter()
.filter(|e| e.category.to_lowercase() == category.to_lowercase())
.collect()
}
/// Get all unique categories
pub fn categories(&self) -> Vec<String> {
let mut cats: Vec<String> = self.entries
.iter()
.map(|e| e.category.clone())
.collect();
cats.sort();
cats.dedup();
cats
}
}
// Backwards compatibility functions
pub fn entries() -> Vec<CatalogEntry> {
CatalogManager::bundled().entries
}
pub fn entries_by_category(category: Category) -> Vec<CatalogEntry> {
let manager = CatalogManager::bundled();
manager
.entries_by_category(category.as_str())
.into_iter()
.cloned()
.collect()
}
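The override-by-repo-key rule inside `CatalogManager::load` can be isolated into a small standalone sketch: entries from a later catalog replace any earlier entry sharing the same `repo`, while unrelated entries keep their order. The tuple representation here stands in for `CatalogEntry` for brevity:

```rust
/// Merge overlay entries into base, replacing any base entry with the same
/// repo key (mirrors the retain-then-push logic in CatalogManager::load).
fn merge_by_repo(
    mut base: Vec<(String, String)>,
    overlay: Vec<(String, String)>,
) -> Vec<(String, String)> {
    for (repo, name) in overlay {
        // Drop any existing entry for this repo, then append the override.
        base.retain(|(r, _)| r != &repo);
        base.push((repo, name));
    }
    base
}
```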


@@ -1,7 +1,7 @@
use colored::Colorize;
use std::io::{self, Write};
use crate::catalog::{self, CatalogEntry, Category};
use crate::catalog::{CatalogEntry, CatalogManager, Category};
use crate::config::{Config, RepoEntry};
use crate::error::Result;
use crate::paths::Paths;
@@ -11,6 +11,9 @@ pub fn execute(category_filter: Option<String>, interactive: bool) -> Result<()>
let paths = Paths::new()?;
let mut config = Config::load_or_default(&paths.config_file);
// Load catalog from bundled and external sources
let catalog = CatalogManager::load(Some(&paths.catalogs_dir))?;
// Get already configured repos for marking
let configured_repos: Vec<String> = config.repos.iter().map(|r| r.repo.clone()).collect();
@@ -49,7 +52,7 @@ pub fn execute(category_filter: Option<String>, interactive: bool) -> Result<()>
let mut index = 1;
for category in &categories {
let entries = catalog::entries_by_category(*category);
let entries = catalog.entries_by_category(category.as_str());
if entries.is_empty() {
continue;
}
@@ -57,7 +60,7 @@ pub fn execute(category_filter: Option<String>, interactive: bool) -> Result<()>
println!("{}", format!("── {} ──", category.as_str()).cyan().bold());
for entry in entries {
let is_configured = is_repo_configured(&configured_repos, entry.repo);
let is_configured = is_repo_configured(&configured_repos, &entry.repo);
let status_icon = if is_configured {
"".green()
@@ -127,12 +130,12 @@ pub fn execute(category_filter: Option<String>, interactive: bool) -> Result<()>
let mut added_count = 0;
for idx in selected_indices {
if let Some((_, entry)) = indexed_entries.iter().find(|(i, _)| *i == idx) {
if is_repo_configured(&configured_repos, entry.repo) {
if is_repo_configured(&configured_repos, &entry.repo) {
println!(" {} {} (already added)", "•".dimmed(), entry.name.dimmed());
continue;
}
let repo_entry = RepoEntry::new(entry.repo.to_string());
let repo_entry = RepoEntry::new(entry.repo.clone());
if config.add_repo(repo_entry).is_ok() {
println!(" {} {}", "✓".green(), entry.name.cyan());
added_count += 1;

src/commands/doctor.rs Normal file

@@ -0,0 +1,334 @@
use colored::Colorize;
use std::path::Path;
use crate::config::Config;
use crate::error::Result;
use crate::paths::Paths;
use crate::repo::Repository;
/// Diagnostic check result
#[derive(Debug)]
pub struct DiagnosticResult {
pub name: String,
pub status: DiagnosticStatus,
pub message: String,
pub fix_available: bool,
}
#[derive(Debug, PartialEq)]
pub enum DiagnosticStatus {
Ok,
Warning,
Error,
}
pub fn execute(fix: bool) -> Result<()> {
let paths = Paths::new()?;
let config = Config::load(&paths.config_file)?;
println!("{}", "Running diagnostics...".bold());
println!();
let mut results = Vec::new();
// 1. Check permissions
results.extend(check_permissions(&paths, &config));
// 2. Check symlink support
results.push(check_symlink_support(&paths));
// 3. Check repo health
results.extend(check_repo_health(&paths, &config));
// 4. Check for broken targets
results.extend(check_target_health(&config));
// 5. Check for orphaned assets
results.extend(check_orphaned_assets(&paths, &config));
// Display results
let mut ok_count = 0;
let mut warning_count = 0;
let mut error_count = 0;
for result in &results {
let icon = match result.status {
DiagnosticStatus::Ok => {
ok_count += 1;
"".green()
}
DiagnosticStatus::Warning => {
warning_count += 1;
"".yellow()
}
DiagnosticStatus::Error => {
error_count += 1;
"".red()
}
};
println!("{} {}: {}", icon, result.name.bold(), result.message);
if fix && result.fix_available && result.status != DiagnosticStatus::Ok {
if let Some(fixed) = attempt_fix(&result.name, &paths, &config) {
println!(" {} {}", "→".dimmed(), fixed.green());
}
}
}
println!();
println!(
"{} {} passed, {} warnings, {} errors",
"Summary:".bold(),
ok_count.to_string().green(),
warning_count.to_string().yellow(),
error_count.to_string().red()
);
Ok(())
}
fn check_permissions(paths: &Paths, config: &Config) -> Vec<DiagnosticResult> {
let mut results = Vec::new();
// Check config directory
results.push(check_dir_writable(&paths.config_dir, "Config directory"));
results.push(check_dir_writable(&paths.repos_dir, "Repos directory"));
// Check each target's directories
for target in &config.targets {
if target.enabled {
results.push(check_dir_writable(
&target.scripts_dir(),
&format!("{} scripts", target.name),
));
}
}
results
}
fn check_dir_writable(path: &Path, name: &str) -> DiagnosticResult {
if !path.exists() {
return DiagnosticResult {
name: name.to_string(),
status: DiagnosticStatus::Warning,
message: format!("Directory does not exist: {}", path.display()),
fix_available: true,
};
}
// Try to create a test file
let test_file = path.join(".empeve-doctor-test");
match std::fs::write(&test_file, "") {
Ok(_) => {
let _ = std::fs::remove_file(&test_file);
DiagnosticResult {
name: name.to_string(),
status: DiagnosticStatus::Ok,
message: "Writable".to_string(),
fix_available: false,
}
}
Err(e) => DiagnosticResult {
name: name.to_string(),
status: DiagnosticStatus::Error,
message: format!("Not writable: {}", e),
fix_available: false,
},
}
}
fn check_symlink_support(paths: &Paths) -> DiagnosticResult {
let test_source = paths.config_dir.join(".empeve-symlink-test-src");
let test_link = paths.config_dir.join(".empeve-symlink-test-link");
// Create test file
if std::fs::write(&test_source, "test").is_err() {
return DiagnosticResult {
name: "Symlink support".to_string(),
status: DiagnosticStatus::Warning,
message: "Could not test symlinks (directory not writable)".to_string(),
fix_available: false,
};
}
// Try to create symlink
#[cfg(unix)]
let symlink_result = std::os::unix::fs::symlink(&test_source, &test_link);
#[cfg(windows)]
let symlink_result = std::os::windows::fs::symlink_file(&test_source, &test_link);
let result = match symlink_result {
Ok(_) => {
let _ = std::fs::remove_file(&test_link);
DiagnosticResult {
name: "Symlink support".to_string(),
status: DiagnosticStatus::Ok,
message: "Symlinks are supported".to_string(),
fix_available: false,
}
}
Err(e) => DiagnosticResult {
name: "Symlink support".to_string(),
status: DiagnosticStatus::Warning,
message: format!("Symlinks not supported ({}). Will use file copying.", e),
fix_available: false,
},
};
let _ = std::fs::remove_file(&test_source);
result
}
fn check_repo_health(paths: &Paths, config: &Config) -> Vec<DiagnosticResult> {
let mut results = Vec::new();
for entry in &config.repos {
let repo = Repository::from_entry(entry.clone(), paths);
if !repo.is_cloned {
results.push(DiagnosticResult {
name: format!("Repo: {}", entry.repo),
status: DiagnosticStatus::Warning,
message: "Not cloned".to_string(),
fix_available: true,
});
continue;
}
// Check if it's a valid git repo
match repo.open() {
Ok(git_repo) => {
// Check for broken refs
match git_repo.head() {
Ok(_) => {
results.push(DiagnosticResult {
name: format!("Repo: {}", entry.repo),
status: DiagnosticStatus::Ok,
message: "Healthy".to_string(),
fix_available: false,
});
}
Err(e) => {
results.push(DiagnosticResult {
name: format!("Repo: {}", entry.repo),
status: DiagnosticStatus::Error,
message: format!("Broken HEAD: {}", e),
fix_available: true,
});
}
}
}
Err(e) => {
results.push(DiagnosticResult {
name: format!("Repo: {}", entry.repo),
status: DiagnosticStatus::Error,
message: format!("Invalid git repo: {}", e),
fix_available: true,
});
}
}
}
results
}
fn check_target_health(config: &Config) -> Vec<DiagnosticResult> {
let mut results = Vec::new();
for target in &config.targets {
if !target.path.exists() {
results.push(DiagnosticResult {
name: format!("Target: {}", target.name),
status: DiagnosticStatus::Error,
message: format!("Base path does not exist: {}", target.path.display()),
fix_available: true,
});
} else if !target.enabled {
results.push(DiagnosticResult {
name: format!("Target: {}", target.name),
status: DiagnosticStatus::Warning,
message: "Disabled".to_string(),
fix_available: false,
});
} else {
results.push(DiagnosticResult {
name: format!("Target: {}", target.name),
status: DiagnosticStatus::Ok,
message: "Healthy".to_string(),
fix_available: false,
});
}
}
results
}
fn check_orphaned_assets(paths: &Paths, config: &Config) -> Vec<DiagnosticResult> {
let mut results = Vec::new();
for target in config.enabled_targets() {
let scripts_dir = target.scripts_dir();
if !scripts_dir.exists() {
continue;
}
let mut orphan_count = 0;
if let Ok(entries) = std::fs::read_dir(&scripts_dir) {
for entry in entries.filter_map(|e| e.ok()) {
let path = entry.path();
// Check if it's a symlink pointing to a non-existent target
if path.is_symlink() {
if let Ok(target_path) = std::fs::read_link(&path) {
// Check if target is in repos dir and doesn't exist
if target_path.starts_with(&paths.repos_dir) && !target_path.exists() {
orphan_count += 1;
}
}
}
}
}
if orphan_count > 0 {
results.push(DiagnosticResult {
name: format!("Orphaned scripts ({})", target.name),
status: DiagnosticStatus::Warning,
message: format!("{} broken symlinks found", orphan_count),
fix_available: true,
});
}
}
results
}
fn attempt_fix(name: &str, paths: &Paths, config: &Config) -> Option<String> {
// Handle directory creation fixes
if name == "Config directory" {
if std::fs::create_dir_all(&paths.config_dir).is_ok() {
return Some("Created config directory".to_string());
}
} else if name == "Repos directory" {
if std::fs::create_dir_all(&paths.repos_dir).is_ok() {
return Some("Created repos directory".to_string());
}
} else if name.ends_with(" scripts") {
// Find the target and create its scripts directory
let target_name = name.trim_end_matches(" scripts");
if let Some(target) = config.targets.iter().find(|t| t.name == target_name) {
if target.ensure_directories().is_ok() {
return Some(format!("Created directories for {}", target_name));
}
}
} else if name.starts_with("Target: ") {
let target_name = name.trim_start_matches("Target: ");
if let Some(target) = config.targets.iter().find(|t| t.name == target_name) {
if std::fs::create_dir_all(&target.path).is_ok() {
return Some("Created target base directory".to_string());
}
}
}
None
}


@@ -17,10 +17,14 @@ struct ImportableScript {
}
/// Execute the `import` command - detect and import existing scripts
pub fn execute() -> Result<()> {
pub fn execute(convert_local: bool, script_filter: Option<String>) -> Result<()> {
let paths = Paths::new()?;
let mut config = Config::load(&paths.config_file)?;
if convert_local {
return execute_convert_local(&paths, &mut config, script_filter);
}
let importable = find_importable_scripts(&paths)?;
if importable.is_empty() {
@@ -250,3 +254,159 @@ fn extract_repo_identifier(url: &str) -> String {
// For other URLs, return the whole thing
url.to_string()
}
/// Execute the convert-local command - convert local scripts to git repos
fn execute_convert_local(
paths: &Paths,
config: &mut Config,
script_filter: Option<String>,
) -> Result<()> {
use std::fs;
println!("{}", "Converting local scripts to git-managed repos".bold().underline());
println!();
let mut converted_count = 0;
let mut skipped_count = 0;
let mut _failed_count = 0;
// Find local (non-git-backed) scripts and convert them
// Clone targets first to avoid borrow conflicts
let targets: Vec<_> = config.enabled_targets().cloned().collect();
for target in &targets {
let scripts_dir = target.scripts_dir();
if !scripts_dir.exists() {
continue;
}
if let Ok(entries) = fs::read_dir(&scripts_dir) {
for entry in entries.filter_map(|e| e.ok()) {
let path = entry.path();
let name = path.file_name()
.and_then(|n| n.to_str())
.map(|n| n.trim_end_matches(".lua").trim_end_matches(".js"))
.unwrap_or("");
// Skip if filter provided and doesn't match
if let Some(ref filter) = script_filter {
if name != filter {
continue;
}
}
// Skip symlinks (already managed)
if path.is_symlink() {
skipped_count += 1;
continue;
}
// Skip if already in config
let local_id = format!("local/{}", name);
if config.repos.iter().any(|r| r.repo == local_id) {
println!(" {} {} (already managed)", "•".dimmed(), name);
skipped_count += 1;
continue;
}
// Convert the script
match convert_script_to_repo(&path, name, paths, config) {
Ok(_) => {
println!(" {} {} converted to local repo", "✓".green(), name.cyan());
converted_count += 1;
}
Err(e) => {
println!(" {} {} failed: {}", "✗".red(), name, e);
_failed_count += 1;
}
}
}
}
}
if converted_count > 0 {
config.save(&paths.config_file)?;
println!();
println!(
"{} Converted {} script(s) to local repos",
"Done!".green().bold(),
converted_count.to_string().cyan()
);
println!();
println!("{}", "Run 'empeve install' to create symlinks for converted scripts.".dimmed());
} else {
println!();
println!("{}", "No scripts to convert.".yellow());
if skipped_count > 0 {
println!("Skipped {} script(s) (already managed or symlinks)", skipped_count);
}
}
Ok(())
}
/// Convert a local script to a git-managed repository
fn convert_script_to_repo(
source_path: &Path,
name: &str,
paths: &Paths,
config: &mut Config,
) -> Result<()> {
use std::fs;
// Create unique identifier for local repo
let local_repo_id = format!("local/{}", name);
let repo_path = paths.repos_dir.join(&local_repo_id);
// Create directory
fs::create_dir_all(&repo_path)?;
// Copy script files to repo directory
if source_path.is_dir() {
copy_dir_recursive(source_path, &repo_path)?;
} else {
let file_name = source_path.file_name().unwrap();
fs::copy(source_path, repo_path.join(file_name))?;
}
// Initialize git repository
let repo = GitOps::init(&repo_path)?;
// Create initial commit
GitOps::add_all_and_commit(
&repo,
&format!("Initial import of {} from local mpv scripts\n\nConverted by empeve", name),
)?;
// Add to config as local repo
let mut entry = RepoEntry::new(local_repo_id);
entry.local = true;
config.add_repo(entry)?;
// Remove original (will be symlinked back during install)
if source_path.is_dir() {
fs::remove_dir_all(source_path)?;
} else {
fs::remove_file(source_path)?;
}
Ok(())
}
/// Recursively copy a directory
fn copy_dir_recursive(src: &Path, dst: &Path) -> Result<()> {
use std::fs;
fs::create_dir_all(dst)?;
for entry in fs::read_dir(src)? {
let entry = entry?;
let src_path = entry.path();
let dst_path = dst.join(entry.file_name());
if src_path.is_dir() {
copy_dir_recursive(&src_path, &dst_path)?;
} else {
fs::copy(&src_path, &dst_path)?;
}
}
Ok(())
}


@@ -2,13 +2,19 @@ use colored::Colorize;
use crate::config::{Config, TargetConfig};
use crate::error::Result;
use crate::lockfile::Lockfile;
use crate::paths::Paths;
use crate::repo::{Repository, ScriptDiscovery};
use crate::repo::{GitOps, Repository, ScriptDiscovery};
use crate::script::ScriptInstaller;
use crate::ui::create_spinner;
/// Execute the `install` command - clone repos and install scripts
pub fn execute(force: bool, repos_filter: Option<Vec<String>>, target_filter: Option<Vec<String>>) -> Result<()> {
pub fn execute(
force: bool,
repos_filter: Option<Vec<String>>,
target_filter: Option<Vec<String>>,
locked: bool,
) -> Result<()> {
let paths = Paths::new()?;
paths.ensure_directories()?;
@@ -23,6 +29,20 @@ pub fn execute(force: bool, repos_filter: Option<Vec<String>>, target_filter: Op
return Ok(());
}
// Load lockfile if --locked flag is set
let lockfile = if locked {
let loaded_lockfile = Lockfile::load(&paths.lockfile)?;
if loaded_lockfile.is_empty() {
println!("{}", "No lockfile found. Run 'empeve lock' first.".yellow());
return Ok(());
}
println!("{}", "Using lockfile for exact versions.".dimmed());
println!();
Some(loaded_lockfile)
} else {
None
};
// Check if we have any targets configured, filtered by --target flag
let targets: Vec<&TargetConfig> = config
.enabled_targets()
@@ -33,6 +53,18 @@ pub fn execute(force: bool, repos_filter: Option<Vec<String>>, target_filter: Op
})
.collect();
// Proactively ensure all target directories exist
for target in &targets {
if let Err(e) = target.ensure_directories() {
eprintln!(
"{}: Could not create directories for target '{}': {}",
"Warning".yellow(),
target.name,
e
);
}
}
if targets.is_empty() {
if target_filter.is_some() {
println!("{}", "No matching targets found.".yellow());
@@ -70,18 +102,32 @@ pub fn execute(force: bool, repos_filter: Option<Vec<String>>, target_filter: Op
// Clone if not already cloned
if !repo.is_cloned {
let spinner = create_spinner(&format!("Cloning {}...", entry.repo));
match repo.clone(config.settings.shallow_clone) {
Ok(_) => {
spinner.finish_with_message(format!("{} {} {}", "Cloned".green(), entry.repo.cyan(), "".green()));
}
Err(e) => {
spinner.finish_with_message(format!("{} {} {}", "Failed".red(), entry.repo.cyan(), "".red()));
let error_hint = format_clone_error(&e);
eprintln!(" {}", error_hint.red());
failed_repos.push((entry.repo.clone(), error_hint));
if entry.is_local() {
// Local repos should already exist in repos_dir
// Check if they exist, otherwise warn
if repo.local_path.exists() {
println!(" {} {} (local repo)", "".dimmed(), entry.repo.cyan());
} else {
println!(" {} {} (local repo missing)", "".yellow(), entry.repo);
eprintln!(" {}", "Local repository not found. It may have been deleted.".red());
failed_repos.push((entry.repo.clone(), "Local repo directory missing".to_string()));
continue;
}
} else {
// Clone from remote
let spinner = create_spinner(&format!("Cloning {}...", entry.repo));
match repo.clone(config.settings.shallow_clone) {
Ok(_) => {
spinner.finish_with_message(format!("{} {} {}", "Cloned".green(), entry.repo.cyan(), "✓".green()));
}
Err(e) => {
spinner.finish_with_message(format!("{} {} {}", "Failed".red(), entry.repo.cyan(), "✗".red()));
let error_hint = format_clone_error(&e);
eprintln!(" {}", error_hint.red());
failed_repos.push((entry.repo.clone(), error_hint));
continue;
}
}
}
} else if force {
println!(" {} {} (already cloned)", "".dimmed(), entry.repo.cyan());
@@ -89,6 +135,33 @@ pub fn execute(force: bool, repos_filter: Option<Vec<String>>, target_filter: Op
println!(" {} {} (already cloned)", "".dimmed(), entry.repo.cyan());
}
// If locked, checkout specific commit
if let Some(ref loaded_lockfile) = lockfile {
if let Some(locked_repo) = loaded_lockfile.get_locked(&entry.repo) {
if let Ok(git_repo) = repo.open() {
let current_commit = GitOps::head_commit(&git_repo).unwrap_or_default();
if current_commit != locked_repo.commit {
// Checkout the locked commit
if let Err(e) = checkout_commit(&git_repo, &locked_repo.commit) {
println!(
" {} Could not checkout locked commit for {}: {}",
"".yellow(),
entry.repo,
e
);
} else {
println!(
" {} {} checked out to locked commit {}",
"🔒".dimmed(),
entry.repo.cyan(),
&locked_repo.commit[..7.min(locked_repo.commit.len())].dimmed()
);
}
}
}
}
}
// Discover scripts
let scripts = ScriptDiscovery::discover(&repo.local_path);
@@ -127,9 +200,6 @@ pub fn execute(force: bool, repos_filter: Option<Vec<String>>, target_filter: Op
config.settings.use_symlinks,
);
// Ensure target directories exist
std::fs::create_dir_all(target.scripts_dir()).ok();
let target_label = if repo_targets.len() > 1 {
format!("[{}] ", target.name).dimmed()
} else {
@@ -248,3 +318,11 @@ fn format_clone_error(error: &crate::error::EmpveError) -> String {
format!("Clone failed: {}", error_str)
}
}
/// Checkout a specific commit in the repository
fn checkout_commit(repo: &git2::Repository, commit: &str) -> Result<()> {
let oid = git2::Oid::from_str(commit)?;
repo.set_head_detached(oid)?;
repo.checkout_head(Some(git2::build::CheckoutBuilder::default().force()))?;
Ok(())
}

src/commands/lock.rs (new file, 86 lines)

@@ -0,0 +1,86 @@
use colored::Colorize;
use crate::config::Config;
use crate::error::Result;
use crate::lockfile::{Lockfile, LockedRepo};
use crate::paths::Paths;
use crate::repo::{GitOps, Repository};
/// Execute the `lock` command - create/update lockfile with current state
pub fn execute() -> Result<()> {
let paths = Paths::new()?;
let config = Config::load(&paths.config_file)?;
if config.repos.is_empty() {
println!("{}", "No repositories configured.".yellow());
return Ok(());
}
let mut lockfile = Lockfile::new();
println!("{}", "Creating lockfile...".bold());
println!();
let mut locked_count = 0;
let mut skipped_count = 0;
for entry in config.enabled_repos() {
let repo = Repository::from_entry(entry.clone(), &paths);
if !repo.is_cloned {
println!(" {} {} (not cloned, skipping)", "•".dimmed(), entry.repo.dimmed());
skipped_count += 1;
continue;
}
match repo.open() {
Ok(git_repo) => {
match GitOps::head_commit(&git_repo) {
Ok(commit) => {
lockfile.lock_repo(
&entry.repo,
LockedRepo {
commit: commit.clone(),
source: entry.git_url(),
rev: entry.rev.clone(),
},
);
let short_commit = &commit[..7.min(commit.len())];
println!(" {} {} @ {}", "✓".green(), entry.repo.cyan(), short_commit.dimmed());
locked_count += 1;
}
Err(e) => {
println!(" {} {} (error: {})", "✗".red(), entry.repo, e);
skipped_count += 1;
}
}
}
Err(e) => {
println!(" {} {} (error: {})", "✗".red(), entry.repo, e);
skipped_count += 1;
}
}
}
if locked_count > 0 {
lockfile.save(&paths.lockfile)?;
println!();
println!(
"{} Lockfile saved to {}",
"Done!".green().bold(),
paths.lockfile.display().to_string().dimmed()
);
println!(
" {} repos locked, {} skipped",
locked_count.to_string().green(),
skipped_count.to_string().yellow()
);
} else {
println!();
println!("{}", "No repos to lock.".yellow());
}
Ok(())
}


@@ -3,9 +3,11 @@ use clap::Subcommand;
pub mod add;
pub mod browse;
pub mod clean;
pub mod doctor;
pub mod import;
pub mod install;
pub mod list;
pub mod lock;
pub mod remove;
pub mod status;
pub mod update;
@@ -44,6 +46,10 @@ pub enum Commands {
/// Only install specific repos
repos: Option<Vec<String>>,
/// Use lockfile for exact versions
#[arg(long)]
locked: bool,
},
/// Update all repositories
@@ -60,7 +66,11 @@ pub enum Commands {
},
/// Show status of all repos and scripts
Status,
Status {
/// Show detailed info including installed scripts per target
#[arg(short, long)]
verbose: bool,
},
/// List all installed scripts
List {
@@ -70,7 +80,15 @@ pub enum Commands {
},
/// Import existing scripts from mpv directory
Import,
Import {
/// Convert local scripts to git-managed repos
#[arg(long)]
convert_local: bool,
/// Specific script to convert (by name)
#[arg(short, long)]
script: Option<String>,
},
/// Browse popular mpv scripts
Browse {
@@ -81,4 +99,14 @@ pub enum Commands {
#[arg(short, long)]
interactive: bool,
},
/// Diagnose common issues with empeve setup
Doctor {
/// Automatically fix issues where possible
#[arg(long)]
fix: bool,
},
/// Create lockfile with current repo commits
Lock,
}


@@ -3,12 +3,15 @@ use colored::Colorize;
use crate::config::Config;
use crate::error::Result;
use crate::paths::Paths;
use crate::repo::{Repository, ScriptDiscovery};
use crate::repo::git_ops::RevType;
use crate::repo::discovery::ScriptDiscovery;
use crate::repo::Repository;
/// Execute the `status` command - show status of all repos
pub fn execute() -> Result<()> {
pub fn execute(verbose: bool) -> Result<()> {
let paths = Paths::new()?;
let config = Config::load(&paths.config_file)?;
let enabled_targets = config.enabled_targets().collect::<Vec<_>>();
// Show targets section
let targets: Vec<_> = config.targets.iter().collect();
@@ -96,7 +99,7 @@ pub fn execute() -> Result<()> {
// Count scripts
let scripts = ScriptDiscovery::discover(&repo.local_path);
let script_count = if let Some(ref filter) = entry.scripts {
ScriptDiscovery::filter_scripts(scripts, filter).len()
ScriptDiscovery::filter_scripts(scripts.clone(), filter).len()
} else {
scripts.len()
};
@@ -107,6 +110,23 @@ pub fn execute() -> Result<()> {
script_count
);
// Show pinned vs tracking status
let rev_type = RevType::from_rev(entry.rev.as_deref());
match rev_type {
RevType::Commit(ref c) => {
println!(" {} commit {}", "📌 pinned:".dimmed(), &c[..7.min(c.len())].yellow());
}
RevType::Tag(ref t) => {
println!(" {} tag {}", "📌 pinned:".dimmed(), t.yellow());
}
RevType::Branch(ref b) => {
println!(" {} branch {}", "🔄 tracking:".dimmed(), b.green());
}
RevType::Default => {
println!(" {} {}", "🔄 tracking:".dimmed(), "default branch".green());
}
}
// Show targets if in multi-target mode
if has_multiple_targets {
let repo_targets: Vec<_> = match &entry.targets {
@@ -120,11 +140,6 @@ pub fn execute() -> Result<()> {
);
}
// Show branch/rev if specified
if let Some(ref rev) = entry.rev {
println!(" {} {}", "branch:".dimmed(), rev);
}
// Show script filter if specified
if let Some(ref filter) = entry.scripts {
println!(
@@ -133,6 +148,43 @@ pub fn execute() -> Result<()> {
filter.join(", ").dimmed()
);
}
// Show installed scripts per target (verbose mode only)
if verbose && repo.is_cloned {
let scripts = ScriptDiscovery::discover(&repo.local_path);
let scripts = if let Some(ref filter) = entry.scripts {
ScriptDiscovery::filter_scripts(scripts, filter)
} else {
scripts
};
if !scripts.is_empty() {
println!(" {}:", "scripts".dimmed());
for script in &scripts {
print!(" {} {}", "-".dimmed(), script.name);
// Show which targets have this script installed
let mut installed_targets = Vec::new();
for target in &enabled_targets {
if entry.should_install_to(&target.name) {
let script_path = target.scripts_dir().join(&script.name);
let script_path_lua = target.scripts_dir().join(format!("{}.lua", &script.name));
let script_path_js = target.scripts_dir().join(format!("{}.js", &script.name));
if script_path.exists() || script_path_lua.exists() || script_path_js.exists() {
installed_targets.push(target.name.as_str());
}
}
}
if !installed_targets.is_empty() {
print!(" {}", format!("[{}]", installed_targets.join(", ")).green());
} else {
print!(" {}", "(not installed)".yellow());
}
println!();
}
}
}
}
println!();


@@ -3,7 +3,7 @@ use colored::Colorize;
use crate::config::Config;
use crate::error::Result;
use crate::paths::Paths;
use crate::repo::Repository;
use crate::repo::{Repository, git_ops::{RevType, UpdateResult}};
use crate::ui::create_spinner;
/// Execute the `update` command - fetch and update all repositories
@@ -52,35 +52,50 @@ pub fn execute(repos_filter: Option<Vec<String>>) -> Result<()> {
continue;
}
// Check if there are updates
match repo.has_updates() {
Ok(true) => {
// Apply updates
match repo.update() {
Ok(new_commit) => {
spinner.finish_with_message(format!(
"{} {} {} ({}{})",
"".green(),
entry.repo.cyan(),
"updated".green(),
&before_commit[..7.min(before_commit.len())].dimmed(),
&new_commit[..7.min(new_commit.len())].green()
));
updated += 1;
}
Err(e) => {
spinner.finish_with_message(format!("{} {} {}", "".red(), entry.repo.cyan(), "update failed".red()));
eprintln!(" {}: {}", "Error".red(), e);
errors += 1;
}
}
// Check for updates
match repo.update() {
Ok(UpdateResult::Updated(new_commit)) => {
spinner.finish_with_message(format!(
"{} {} {} ({} → {})",
"✓".green(),
entry.repo.cyan(),
"updated".green(),
&before_commit[..7.min(before_commit.len())].dimmed(),
&new_commit[..7.min(new_commit.len())].green()
));
updated += 1;
}
Ok(false) => {
spinner.finish_with_message(format!("{} {} {}", "".dimmed(), entry.repo.cyan(), "up to date".dimmed()));
Ok(UpdateResult::UpToDate) => {
spinner.finish_with_message(format!(
"{} {} {}",
"•".dimmed(),
entry.repo.cyan(),
"up to date".dimmed()
));
up_to_date += 1;
}
Ok(UpdateResult::Pinned) => {
let pin_info = match repo.rev_type() {
RevType::Commit(c) => format!("commit {}", &c[..7.min(c.len())]),
RevType::Tag(t) => format!("tag {}", t),
_ => "pinned".to_string(),
};
spinner.finish_with_message(format!(
"{} {} {} ({})",
"📌".dimmed(),
entry.repo.cyan(),
"pinned".dimmed(),
pin_info.dimmed()
));
up_to_date += 1;
}
Err(e) => {
spinner.finish_with_message(format!("{} {} {}", "".red(), entry.repo.cyan(), "check failed".red()));
spinner.finish_with_message(format!(
"{} {} {}",
"✗".red(),
entry.repo.cyan(),
"update failed".red()
));
eprintln!(" {}: {}", "Error".red(), e);
errors += 1;
}


@@ -46,6 +46,23 @@ impl TargetConfig {
pub fn shaders_dir(&self) -> PathBuf {
self.path.join("shaders")
}
/// Ensure all asset directories exist for this target
pub fn ensure_directories(&self) -> std::io::Result<()> {
std::fs::create_dir_all(self.scripts_dir())?;
std::fs::create_dir_all(self.script_opts_dir())?;
std::fs::create_dir_all(self.fonts_dir())?;
std::fs::create_dir_all(self.shaders_dir())?;
Ok(())
}
/// Check if all directories exist
pub fn directories_exist(&self) -> bool {
self.scripts_dir().exists()
&& self.script_opts_dir().exists()
&& self.fonts_dir().exists()
&& self.shaders_dir().exists()
}
}
/// Main configuration structure
@@ -122,6 +139,10 @@ pub struct RepoEntry {
/// Optional: disable this repo without removing
#[serde(default, skip_serializing_if = "is_false")]
pub disabled: bool,
/// Whether this is a local-only repo (not a remote git URL)
#[serde(default, skip_serializing_if = "is_false")]
pub local: bool,
}
fn is_false(value: &bool) -> bool {
@@ -138,6 +159,7 @@ impl RepoEntry {
scripts: None,
targets: None,
disabled: false,
local: false,
}
}
@@ -172,8 +194,18 @@ impl RepoEntry {
&self.repo
}
/// Check if this is a local-only repository
pub fn is_local(&self) -> bool {
self.local || self.repo.starts_with("local/")
}
/// Parse the repo identifier into a full git URL
pub fn git_url(&self) -> String {
// Local repos don't have URLs
if self.is_local() {
return String::new();
}
let repo = &self.repo;
if repo.starts_with("http://")
|| repo.starts_with("https://")


@@ -1,4 +1,20 @@
use thiserror::Error;
use git2::{ErrorClass, ErrorCode};
/// Detailed git error with actionable hint
#[derive(Debug)]
pub struct GitDetailedError {
pub code: ErrorCode,
pub class: ErrorClass,
pub message: String,
pub hint: String,
}
impl std::fmt::Display for GitDetailedError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "{}", self.message)
}
}
#[derive(Error, Debug)]
pub enum EmpveError {
@@ -30,4 +46,63 @@ pub enum EmpveError {
InvalidRepo(String),
}
impl EmpveError {
/// Get a user-friendly hint for resolving this error
pub fn hint(&self) -> Option<String> {
match self {
EmpveError::Git(e) => Some(git_error_hint(e)),
EmpveError::RepoNotFound(_) => Some("Check the repository name and try again. Use 'empeve browse' to see available scripts.".to_string()),
EmpveError::RepoExists(_) => Some("Use 'empeve remove' first if you want to replace it.".to_string()),
EmpveError::ScriptNotFound(_) => Some("Check script name or run 'empeve status' to see available scripts.".to_string()),
EmpveError::InvalidRepo(_) => Some("Use format 'user/repo' for GitHub or a full git URL.".to_string()),
_ => None,
}
}
}
/// Generate a user-friendly hint from a git2 error
pub fn git_error_hint(error: &git2::Error) -> String {
match (error.code(), error.class()) {
(ErrorCode::NotFound, ErrorClass::Reference) => {
"The specified branch or ref was not found. Check if it exists on the remote.".to_string()
}
(ErrorCode::NotFound, ErrorClass::Repository) => {
"Repository not found. Verify the URL is correct and you have access.".to_string()
}
(ErrorCode::Auth, _) => {
"Authentication failed. The repository may be private or credentials are invalid.".to_string()
}
(ErrorCode::Certificate, _) | (_, ErrorClass::Ssl) => {
"SSL/TLS certificate error. Check your system certificates or network.".to_string()
}
(ErrorCode::Locked, _) => {
"Repository is locked. Another process may be using it. Try again later.".to_string()
}
(ErrorCode::Exists, _) => {
"The destination already exists. Use --force to overwrite.".to_string()
}
(ErrorCode::BareRepo, _) => {
"Cannot perform this operation on a bare repository.".to_string()
}
(ErrorCode::UnbornBranch, _) => {
"The repository has no commits yet.".to_string()
}
(ErrorCode::Uncommitted, _) => {
"There are uncommitted changes. Commit or stash them first.".to_string()
}
(_, ErrorClass::Net) | (ErrorCode::GenericError, ErrorClass::Os) => {
"Network error. Check your internet connection.".to_string()
}
(_, ErrorClass::Checkout) => {
"Checkout failed. There may be conflicting local changes.".to_string()
}
(_, ErrorClass::FetchHead) => {
"Failed to update FETCH_HEAD. Try running 'empeve clean' and reinstalling.".to_string()
}
_ => {
format!("Git operation failed ({:?}/{:?})", error.class(), error.code())
}
}
}
pub type Result<T> = std::result::Result<T, EmpveError>;


@@ -3,6 +3,7 @@ pub mod cli;
pub mod commands;
pub mod config;
pub mod error;
pub mod lockfile;
pub mod paths;
pub mod repo;
pub mod script;

src/lockfile.rs (new file, 81 lines)

@@ -0,0 +1,81 @@
//! Lockfile for reproducible empeve installations
use serde::{Deserialize, Serialize};
use std::collections::HashMap;
use std::path::Path;
use crate::error::Result;
/// Lockfile recording exact state of all repositories
#[derive(Debug, Serialize, Deserialize, Default)]
pub struct Lockfile {
/// Version of lockfile format
pub version: u32,
/// Timestamp when lockfile was created (Unix epoch seconds)
pub created_at: String,
/// Locked repository states
pub repos: HashMap<String, LockedRepo>,
}
/// A locked repository state
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct LockedRepo {
/// Full commit SHA
pub commit: String,
/// Original source URL (for reference)
pub source: String,
/// Branch/tag if originally specified
pub rev: Option<String>,
}
impl Lockfile {
/// Create a new empty lockfile
pub fn new() -> Self {
Self {
version: 1,
created_at: Self::current_timestamp(),
repos: HashMap::new(),
}
}
/// Load lockfile from path
pub fn load(path: &Path) -> Result<Self> {
if !path.exists() {
return Ok(Self::default());
}
let content = std::fs::read_to_string(path)?;
let lockfile: Lockfile = toml::from_str(&content)?;
Ok(lockfile)
}
/// Save lockfile to path
pub fn save(&self, path: &Path) -> Result<()> {
let content = toml::to_string_pretty(self)?;
std::fs::write(path, content)?;
Ok(())
}
/// Check if a repo is locked
pub fn get_locked(&self, identifier: &str) -> Option<&LockedRepo> {
self.repos.get(identifier)
}
/// Lock a repository at a specific commit
pub fn lock_repo(&mut self, identifier: &str, locked: LockedRepo) {
self.repos.insert(identifier.to_string(), locked);
}
/// Check if lockfile has any entries
pub fn is_empty(&self) -> bool {
self.repos.is_empty()
}
fn current_timestamp() -> String {
// Simple timestamp without external dependency
use std::time::{SystemTime, UNIX_EPOCH};
let duration = SystemTime::now()
.duration_since(UNIX_EPOCH)
.unwrap_or_default();
format!("{}", duration.as_secs())
}
}
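Serialized with `toml::to_string_pretty`, a lockfile of this shape would look roughly like the following. The repo names, commit SHAs, and timestamp are illustrative only, and the optional `rev` key is omitted when `None`:

```toml
version = 1
created_at = "1734231877"

[repos."po5/thumbfast"]
commit = "03e9f1ac2b8fba0362971d837d0b5342f701a0bd"
source = "https://github.com/po5/thumbfast"

[repos."tomasklaen/uosc"]
commit = "4b0fa6e287fd1dc9e2a8ec0c8d2aa1b086d76e8e"
source = "https://github.com/tomasklaen/uosc"
rev = "main"
```

Quoted keys keep the `user/repo` identifiers intact as single table names.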


@@ -6,6 +6,12 @@ use std::io::{self, Write};
fn main() {
if let Err(error) = run() {
eprintln!("{}: {}", "error".red().bold(), error);
// Show hint if available
if let Some(hint) = error.hint() {
eprintln!("{}: {}", "hint".yellow(), hint);
}
std::process::exit(1);
}
}
@@ -23,8 +29,8 @@ fn run() -> Result<()> {
commands::Commands::Remove { repo, purge } => {
commands::remove::execute(&repo, purge)?;
}
commands::Commands::Install { force, repos } => {
commands::install::execute(force, repos, cli.target)?;
commands::Commands::Install { force, repos, locked } => {
commands::install::execute(force, repos, cli.target, locked)?;
}
commands::Commands::Update { repos } => {
commands::update::execute(repos)?;
@@ -32,18 +38,24 @@ fn run() -> Result<()> {
commands::Commands::Clean { yes } => {
commands::clean::execute(yes, cli.target)?;
}
commands::Commands::Status => {
commands::status::execute()?;
commands::Commands::Status { verbose } => {
commands::status::execute(verbose)?;
}
commands::Commands::List { detailed } => {
commands::list::execute(detailed, cli.target)?;
}
commands::Commands::Import => {
commands::import::execute()?;
commands::Commands::Import { convert_local, script } => {
commands::import::execute(convert_local, script)?;
}
commands::Commands::Browse { category, interactive } => {
commands::browse::execute(category, interactive)?;
}
commands::Commands::Doctor { fix } => {
commands::doctor::execute(fix)?;
}
commands::Commands::Lock => {
commands::lock::execute()?;
}
}
Ok(())
@@ -54,10 +66,11 @@ fn check_first_run(command: &commands::Commands) -> Result<()> {
// Only check for certain commands
let should_check = matches!(
command,
commands::Commands::Status
commands::Commands::Status { .. }
| commands::Commands::List { .. }
| commands::Commands::Install { .. }
| commands::Commands::Browse { .. }
| commands::Commands::Doctor { .. }
);
if !should_check {
@@ -87,7 +100,7 @@ fn check_first_run(command: &commands::Commands) -> Result<()> {
}
// Show welcome and detected targets
println!("{}", "Welcome to mpv-mgr!".green().bold());
println!("{}", "Welcome to empeve!".green().bold());
println!();
println!("{}", "Detected mpv configuration folders:".bold());
println!();
@@ -139,7 +152,11 @@ fn check_first_run(command: &commands::Commands) -> Result<()> {
for i in indices_to_use {
if let Some(target) = detected.get(i) {
config.add_target(target.to_target_config());
let target_config = target.to_target_config();
if let Err(e) = target_config.ensure_directories() {
eprintln!(" {}: Could not create directories for {}: {}", "Warning".yellow(), target.name, e);
}
config.add_target(target_config);
println!(" {} {}", "".green(), target.name.cyan());
}
}


@@ -15,6 +15,9 @@ pub struct Paths {
/// Directory for cloned repos (~/.config/empeve/repos)
pub repos_dir: PathBuf,
/// Directory for external catalog TOML files (~/.config/empeve/catalogs)
pub catalogs_dir: PathBuf,
/// mpv scripts directory (~/.config/mpv/scripts)
pub mpv_scripts_dir: PathBuf,
@@ -26,6 +29,9 @@ pub struct Paths {
/// mpv shaders directory (~/.config/mpv/shaders)
pub mpv_shaders_dir: PathBuf,
/// Lockfile path (~/.config/empeve/empeve.lock)
pub lockfile: PathBuf,
}
impl Paths {
@@ -40,6 +46,8 @@ impl Paths {
Ok(Self {
config_file: config_dir.join("config.toml"),
repos_dir: config_dir.join("repos"),
catalogs_dir: config_dir.join("catalogs"),
lockfile: config_dir.join("empeve.lock"),
config_dir,
mpv_scripts_dir: mpv_dir.join("scripts"),
mpv_script_opts_dir: mpv_dir.join("script-opts"),


@@ -60,6 +60,9 @@ impl ScriptType {
}
}
/// Entry point file names for multi-file scripts
const ENTRY_POINT_NAMES: &[&str] = &["main", "init"];
/// Locations to search for scripts in a repository
const SCRIPT_LOCATIONS: &[&str] = &[
"", // root directory
@@ -118,7 +121,7 @@ impl ScriptDiscovery {
continue;
}
// Look for script-opts/*.conf or <script_name>.conf
// Look for script-opts/*.conf with prefix matching
let script_opts_dir = search_dir.join("script-opts");
if script_opts_dir.exists() {
Self::collect_config_files(&script_opts_dir, script_name, &mut assets.script_opts);
@@ -130,6 +133,21 @@ impl ScriptDiscovery {
assets.script_opts.push(direct_conf);
}
// Check script directory itself for .conf files
let script_dir = search_dir.join(script_name);
if script_dir.is_dir() {
if let Ok(entries) = std::fs::read_dir(&script_dir) {
for entry in entries.filter_map(|e| e.ok()) {
let path = entry.path();
if path.extension().map(|e| e == "conf").unwrap_or(false) {
if !assets.script_opts.contains(&path) {
assets.script_opts.push(path);
}
}
}
}
}
// Look for fonts/
let fonts_dir = search_dir.join("fonts");
if fonts_dir.exists() {
@@ -154,8 +172,14 @@ impl ScriptDiscovery {
if let Some(ext) = path.extension().and_then(|e| e.to_str()) {
if ext == "conf" {
if let Some(name) = path.file_stem().and_then(|n| n.to_str()) {
// Include if name matches script or is a general config
if name.to_lowercase() == script_name.to_lowercase() {
let name_lower = name.to_lowercase();
let script_lower = script_name.to_lowercase();
// Include if exact match OR prefix match
if name_lower == script_lower
|| name_lower.starts_with(&format!("{}_", script_lower))
|| name_lower.starts_with(&format!("{}.", script_lower))
{
if !dest.contains(&path) {
dest.push(path);
}
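The prefix rule above can be exercised standalone; a std-only sketch with a hypothetical `conf_matches` helper (the separator requirement is what keeps unrelated configs from matching):

```rust
/// Sketch of the script-opts matching rule from discovery.rs: a conf
/// file belongs to a script when its stem equals the script name, or
/// starts with "<script>_" or "<script>." (all case-insensitive).
fn conf_matches(script_name: &str, file_stem: &str) -> bool {
    let name = file_stem.to_lowercase();
    let script = script_name.to_lowercase();
    name == script
        || name.starts_with(&format!("{}_", script))
        || name.starts_with(&format!("{}.", script))
}
```

So `uosc_extra.conf` maps to `uosc`, while a stem like `uoscfoo` does not, because no separator follows the script name.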
@@ -204,31 +228,49 @@ impl ScriptDiscovery {
let mut scripts = Vec::new();
let mut multi_file_dirs = std::collections::HashSet::new();
// First pass: find multi-file scripts (directories with main.lua/main.js)
// First pass: find multi-file scripts (directories with entry points)
for entry in WalkDir::new(dir).max_depth(2).into_iter().filter_map(|e| e.ok()) {
let path = entry.path();
if path.is_dir() {
// Check for main.lua or main.js
let dir_name = path
.file_name()
.map(|n| n.to_string_lossy().into_owned())
.unwrap_or_default();
// Check for entry point files (main.lua, init.lua, <dir_name>.lua, etc.)
for ext in ["lua", "js"] {
let main_script = path.join(format!("main.{}", ext));
if main_script.exists() {
let mut found_entry = None;
// Check standard entry points: main, init
for entry_name in ENTRY_POINT_NAMES {
let entry_path = path.join(format!("{}.{}", entry_name, ext));
if entry_path.exists() {
found_entry = Some(entry_path);
break;
}
}
// Check for directory-named entry point
if found_entry.is_none() {
let dir_named_entry = path.join(format!("{}.{}", dir_name, ext));
if dir_named_entry.exists() {
found_entry = Some(dir_named_entry);
}
}
if let Some(_entry_path) = found_entry {
let repo_path = path
.strip_prefix(repo_root)
.unwrap_or(path)
.to_path_buf();
let name = path
.file_name()
.map(|n| n.to_string_lossy().into_owned())
.unwrap_or_default();
scripts.push(DiscoveredScript {
repo_path: repo_path.clone(),
absolute_path: path.to_path_buf(),
script_type: ScriptType::from_extension(ext).unwrap(),
is_multi_file: true,
name,
name: dir_name.clone(),
assets: ScriptAssets::default(),
});
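The resolution order in the pass above can be summarized as: per extension, try `main.<ext>`, then `init.<ext>`, then a file named after the directory. A std-only sketch with a hypothetical `resolve_entry` helper (file listings passed in as plain slices, no filesystem access):

```rust
/// Entry-point names tried before falling back to a file named after
/// the directory itself (mirrors ENTRY_POINT_NAMES in discovery.rs).
const ENTRY_POINT_NAMES: &[&str] = &["main", "init"];

/// Pick which file would be treated as a multi-file script's entry
/// point, given the directory name and its file listing. Lua is tried
/// before JavaScript, matching the discovery order above.
fn resolve_entry(dir_name: &str, files: &[&str]) -> Option<String> {
    for ext in ["lua", "js"] {
        // Standard entry points first: main.<ext>, then init.<ext>.
        for name in ENTRY_POINT_NAMES {
            let candidate = format!("{}.{}", name, ext);
            if files.contains(&candidate.as_str()) {
                return Some(candidate);
            }
        }
        // Fall back to a directory-named entry point, e.g. uosc/uosc.lua.
        let dir_named = format!("{}.{}", dir_name, ext);
        if files.contains(&dir_named.as_str()) {
            return Some(dir_named);
        }
    }
    None
}
```

One consequence of this ordering: in a directory containing both `init.lua` and `<dir>.lua`, `init.lua` wins.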


@@ -3,10 +3,95 @@ use std::path::Path;
use crate::error::Result;
/// Classification of a revision string
#[derive(Debug, Clone, PartialEq)]
pub enum RevType {
/// A specific commit hash (40-char hex) - pinned, never auto-updates
Commit(String),
/// A tag (semver pattern like v1.2.3) - pinned, never auto-updates
Tag(String),
/// A branch name - tracking, auto-updates on `update`
Branch(String),
/// No rev specified - track default branch
Default,
}
impl RevType {
/// Parse a revision string into a RevType
pub fn from_rev(rev: Option<&str>) -> Self {
match rev {
None => RevType::Default,
Some(r) => {
// 40-char hex = commit hash
if r.len() == 40 && r.chars().all(|c| c.is_ascii_hexdigit()) {
return RevType::Commit(r.to_string());
}
// Semver pattern (v followed by digit) = tag
if r.starts_with('v') && r.chars().nth(1).map_or(false, |c| c.is_ascii_digit()) {
return RevType::Tag(r.to_string());
}
// Otherwise assume branch
RevType::Branch(r.to_string())
}
}
}
/// Check if this rev is pinned (should not auto-update)
pub fn is_pinned(&self) -> bool {
matches!(self, RevType::Commit(_) | RevType::Tag(_))
}
/// Check if this rev is tracking (should auto-update)
pub fn is_tracking(&self) -> bool {
matches!(self, RevType::Branch(_) | RevType::Default)
}
}
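One caveat the heuristic above implies is worth spelling out: only full 40-character SHAs classify as `Commit`, so an abbreviated hash like `6f714e5` falls through to `Branch` and will auto-update. A standalone mirror of the classification (duplicated here purely for illustration, no git2 involved):

```rust
/// Standalone mirror of RevType::from_rev for illustrating edge cases.
#[derive(Debug, Clone, PartialEq)]
enum RevType {
    Commit(String),
    Tag(String),
    Branch(String),
    Default,
}

impl RevType {
    fn from_rev(rev: Option<&str>) -> Self {
        match rev {
            None => RevType::Default,
            Some(r) => {
                // 40-char hex = full commit hash, pinned.
                if r.len() == 40 && r.chars().all(|c| c.is_ascii_hexdigit()) {
                    return RevType::Commit(r.to_string());
                }
                // "v" followed by a digit = tag, pinned.
                if r.starts_with('v') && r.chars().nth(1).map_or(false, |c| c.is_ascii_digit()) {
                    return RevType::Tag(r.to_string());
                }
                // Everything else, including abbreviated SHAs, = branch.
                RevType::Branch(r.to_string())
            }
        }
    }

    fn is_pinned(&self) -> bool {
        matches!(self, RevType::Commit(_) | RevType::Tag(_))
    }
}
```

Users who want true pinning therefore need to supply the full SHA.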
/// Result of an update operation
#[derive(Debug)]
pub enum UpdateResult {
/// Updates applied, returns new commit hash
Updated(String),
/// Already at latest version
UpToDate,
/// This rev is pinned (commit/tag) and should not be updated
Pinned,
}
/// Git operations wrapper using git2
pub struct GitOps;
impl GitOps {
/// Initialize a new git repository at the given path
pub fn init(path: &Path) -> Result<Git2Repo> {
let repo = Git2Repo::init(path)?;
Ok(repo)
}
/// Add all files and create initial commit
pub fn add_all_and_commit(repo: &Git2Repo, message: &str) -> Result<git2::Oid> {
let mut index = repo.index()?;
index.add_all(["."], git2::IndexAddOption::DEFAULT, None)?;
index.write()?;
let tree_id = index.write_tree()?;
let tree = repo.find_tree(tree_id)?;
// Create signature
let sig = git2::Signature::now("empeve", "empeve@localhost")?;
let commit_id = repo.commit(
Some("HEAD"),
&sig,
&sig,
message,
&tree,
&[], // No parents for initial commit
)?;
Ok(commit_id)
}
/// Clone a repository to the specified path
pub fn clone(url: &str, destination: &Path, shallow: bool, branch: Option<&str>) -> Result<Git2Repo> {
let mut builder = RepoBuilder::new();
@@ -54,6 +139,38 @@ impl GitOps {
Ok(())
}
/// Fetch updates for a specific revision
/// If rev looks like a branch, fetch that branch; otherwise fetch HEAD
pub fn fetch_rev(repo: &Git2Repo, rev: Option<&str>) -> Result<()> {
let mut remote = repo.find_remote("origin")?;
let mut fetch_options = FetchOptions::new();
let refspec = match rev {
Some(r) if Self::looks_like_branch(r) => {
format!("+refs/heads/{}:refs/remotes/origin/{}", r, r)
}
_ => "HEAD".to_string(),
};
remote.fetch(&[&refspec], Some(&mut fetch_options), None)?;
Ok(())
}
/// Check if a revision string looks like a branch name
/// Returns false for commit hashes (40-char hex) and semver tags (vX.Y.Z)
fn looks_like_branch(rev: &str) -> bool {
// 40-char hex = commit hash
if rev.len() == 40 && rev.chars().all(|c| c.is_ascii_hexdigit()) {
return false;
}
// Semver pattern (v followed by digit) = tag
if rev.starts_with('v') && rev.chars().nth(1).map_or(false, |c| c.is_ascii_digit()) {
return false;
}
// Otherwise assume branch
true
}
/// Check if there are updates available (after fetch)
pub fn has_updates(repo: &Git2Repo) -> Result<bool> {
let head = repo.head()?.peel_to_commit()?;
@@ -84,15 +201,18 @@ impl GitOps {
let fetch_commit = fetch_head.peel_to_commit()?;
let fetch_commit_id = fetch_commit.id();
// Update HEAD to point to the new commit
repo.set_head_detached(fetch_commit_id)?;
let head_ref = repo.head()?;
// Checkout the new HEAD
repo.checkout_head(Some(
git2::build::CheckoutBuilder::default()
.force()
))?;
if head_ref.is_branch() {
// Update branch reference to new commit
let reference_name = head_ref.name().unwrap_or("HEAD");
repo.reference(reference_name, fetch_commit_id, true, "empeve: fast-forward")?;
} else {
// Detached HEAD (pinned) - update detached
repo.set_head_detached(fetch_commit_id)?;
}
repo.checkout_head(Some(git2::build::CheckoutBuilder::default().force()))?;
Ok(fetch_commit_id.to_string())
}
@@ -100,6 +220,74 @@ impl GitOps {
pub fn is_repo(path: &Path) -> bool {
path.join(".git").is_dir() || Git2Repo::open(path).is_ok()
}
/// Clone a repository and checkout the specified revision
pub fn clone_with_rev(
url: &str,
destination: &Path,
shallow: bool,
rev: Option<&str>,
) -> Result<Git2Repo> {
let rev_type = RevType::from_rev(rev);
match rev_type {
RevType::Default => {
// Clone default branch
Self::clone(url, destination, shallow, None)
}
RevType::Branch(ref branch) => {
// Clone with specific branch
Self::clone(url, destination, shallow, Some(branch))
}
RevType::Tag(ref tag) => {
// Clone then checkout tag
let repo = Self::clone(url, destination, shallow, None)?;
Self::checkout_ref(&repo, tag)?;
Ok(repo)
}
RevType::Commit(ref commit) => {
// For commits, we need a full clone (not shallow) to find the commit
let repo = Self::clone(url, destination, false, None)?;
Self::checkout_ref(&repo, commit)?;
Ok(repo)
}
}
}
/// Checkout a specific reference (tag, commit, or branch)
fn checkout_ref(repo: &Git2Repo, refspec: &str) -> Result<()> {
let object = repo.revparse_single(refspec)?;
let commit = object.peel_to_commit()?;
// Detach HEAD to the commit
repo.set_head_detached(commit.id())?;
repo.checkout_head(Some(
git2::build::CheckoutBuilder::default().force()
))?;
Ok(())
}
/// Check and apply updates based on revision type
pub fn update_for_rev(repo: &Git2Repo, rev: Option<&str>) -> Result<UpdateResult> {
let rev_type = RevType::from_rev(rev);
match rev_type {
RevType::Commit(_) | RevType::Tag(_) => {
// Pinned - don't update
Ok(UpdateResult::Pinned)
}
RevType::Default | RevType::Branch(_) => {
// Tracking - check and apply updates
if Self::has_updates(repo)? {
let new_commit = Self::fast_forward(repo)?;
Ok(UpdateResult::Updated(new_commit))
} else {
Ok(UpdateResult::UpToDate)
}
}
}
}
}
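A caller-side sketch of consuming `update_for_rev`'s result; the `UpdateResult` shape is inferred from this diff, and the `describe` helper is hypothetical:

```rust
// UpdateResult variants as used by update_for_rev in the diff.
enum UpdateResult {
    Updated(String), // fast-forwarded to this commit id
    UpToDate,        // tracking ref, nothing new upstream
    Pinned,          // commit/tag pin - auto-update skipped
}

// Hypothetical helper mapping an update outcome to a status line.
fn describe(result: &UpdateResult) -> String {
    match result {
        UpdateResult::Updated(commit) => format!("updated to {commit}"),
        UpdateResult::UpToDate => "already up to date".into(),
        UpdateResult::Pinned => "pinned; skipping update".into(),
    }
}

fn main() {
    assert_eq!(describe(&UpdateResult::Updated("abc1234".into())), "updated to abc1234");
    assert_eq!(describe(&UpdateResult::UpToDate), "already up to date");
    assert_eq!(describe(&UpdateResult::Pinned), "pinned; skipping update");
}
```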
#[cfg(test)]


@@ -4,7 +4,7 @@ use crate::config::RepoEntry;
use crate::error::Result;
use crate::paths::Paths;
use super::git_ops::{GitOps, RevType, UpdateResult};
/// Represents a repository with its local state
#[derive(Debug)]
@@ -54,12 +54,41 @@ impl Repository {
}
}
/// Get the rev type for this repository
pub fn rev_type(&self) -> RevType {
RevType::from_rev(self.entry.rev.as_deref())
}
/// Check if this repository is pinned (commit or tag)
pub fn is_pinned(&self) -> bool {
self.rev_type().is_pinned()
}
/// Clone this repository
pub fn clone(&mut self, shallow: bool) -> Result<()> {
// Local repos should not be cloned
if self.entry.is_local() {
// For local repos, just mark as cloned if directory exists
if self.local_path.exists() {
self.is_cloned = true;
return Ok(());
} else {
return Err(crate::error::EmpveError::Config(
"Local repository directory does not exist".into()
));
}
}
let url = self.entry.git_url();
let rev = self.entry.rev.as_deref();
// For pinned commits, disable shallow clone
let effective_shallow = match self.rev_type() {
RevType::Commit(_) => false,
_ => shallow,
};
GitOps::clone_with_rev(&url, &self.local_path, effective_shallow, rev)?;
self.is_cloned = true;
Ok(())
@@ -84,7 +113,7 @@ impl Repository {
/// Fetch updates from remote
pub fn fetch(&self) -> Result<()> {
let repo = self.open()?;
GitOps::fetch_rev(&repo, self.entry.rev.as_deref())?;
Ok(())
}
@@ -94,10 +123,10 @@ impl Repository {
GitOps::has_updates(&repo)
}
/// Update to the latest version (rev-aware)
pub fn update(&self) -> Result<UpdateResult> {
let repo = self.open()?;
GitOps::update_for_rev(&repo, self.entry.rev.as_deref())
}
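On the config side, the pinning behavior hinges on what `rev` holds. A hypothetical TOML fragment (table and field names assumed from `RepoEntry`/`rev` in this diff, not taken from the actual config schema):

```toml
# Hypothetical empeve config entries illustrating tracking vs. pinned revs.
[[repos]]
url = "https://github.com/example/some-script"
# branch name: tracked, fast-forwarded by `update`
rev = "main"

[[repos]]
url = "https://github.com/example/other-script"
# tag (or commit hash): pinned, auto-update reports Pinned and skips it
rev = "v1.2.0"
```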
/// Delete the local clone