refactor: remove dead code, add LLM CLI flags, rewrite README

Remove unused crypto module, DataDir/DefaultDBPath (SQLite remnant),
and ListenAndServe (replaced by direct http.Server in main). Strip 17
unreferenced i18n keys from en/de translations. Add --llm-provider,
--llm-model, and --llm-endpoint CLI flags for runtime LLM override
without a config file. Rewrite README with correct Go 1.25 version,
shields, LLM providers table, Docker/Helm deployment docs. Fix
.gitignore pattern to not match cmd/heatguard/ directory.
2026-02-10 05:05:53 +01:00
parent d07f96fb91
commit 8c54852cae
8 changed files with 86 additions and 221 deletions

.gitignore

@@ -1,4 +1,4 @@
bin/
node_modules/
web/css/app.css
-heatguard
+/heatguard

README.md

@@ -2,7 +2,9 @@
Personalized heat preparedness for your home. HeatGuard analyzes your living spaces, fetches weather forecasts, and generates hour-by-hour action plans to keep you safe during heat events.
![Go](https://img.shields.io/badge/Go-1.25-00ADD8?logo=go&logoColor=white)
![License](https://img.shields.io/badge/license-GPL--3.0-blue)
![Docker](https://img.shields.io/badge/Docker-ready-2496ED?logo=docker&logoColor=white)
## Features
@@ -10,7 +12,7 @@ Personalized heat preparedness for your home. HeatGuard analyzes your living spa
- **Risk assessment** — 4-tier risk levels (low/moderate/high/extreme) with time windows
- **24h SVG temperature timeline** — color-coded area chart with budget status strip
- **Weather integration** — Open-Meteo forecasts + DWD severe weather warnings
-- **AI summary** — optional LLM-powered 3-bullet daily briefing (Anthropic, OpenAI, Gemini)
+- **AI summary** — optional LLM-powered daily briefing (Anthropic, OpenAI, Gemini, Ollama)
- **Care checklist** — automatic reminders when vulnerable occupants are present
- **Multilingual** — English and German, switchable in-app
- **Privacy-first** — all user data stays in the browser (IndexedDB), server is stateless
@@ -33,19 +35,14 @@ The Go server embeds all web assets (templates, JS, CSS, i18n) and serves them d
### Prerequisites
-- Go 1.23+
+- Go 1.25+
- Node.js 18+ (for Tailwind CSS build)
### Build & Run
```bash
# Install frontend dependencies
npm install
# Build (compiles CSS + Go binary)
make build
# Run
./bin/heatguard
```
@@ -59,6 +56,22 @@ make dev
Serves files from the filesystem (hot-reload templates/JS) on port 8080.
+## CLI Flags
+
+| Flag | Default | Description |
+|------|---------|-------------|
+| `-port` | `8080` | HTTP listen port |
+| `-dev` | `false` | Development mode (serve from filesystem) |
+| `-llm-provider` | `""` | LLM provider (`anthropic`, `openai`, `gemini`, `ollama`, `none`) |
+| `-llm-model` | `""` | Model name override |
+| `-llm-endpoint` | `""` | API endpoint override (for Ollama) |
+
+Example — run with a local Ollama instance:
+
+```bash
+./bin/heatguard --llm-provider ollama --llm-model llama3.2
+```
## Configuration
HeatGuard works out of the box with zero configuration. Optional server-side config for LLM:
@@ -66,21 +79,32 @@ HeatGuard works out of the box with zero configuration. Optional server-side con
```yaml
# ~/.config/heatwave/config.yaml
llm:
-  provider: anthropic   # anthropic | openai | ollama | none
+  provider: anthropic   # anthropic | openai | gemini | ollama | none
model: claude-sonnet-4-5-20250929
# endpoint: http://localhost:11434 # for ollama
```
-API keys for LLM providers can also be configured directly in the browser under **Setup > AI Summary**, stored locally in IndexedDB.
+API keys via environment variables:
-### CLI Flags
+| Provider | Variable |
+|----------|----------|
+| Anthropic | `ANTHROPIC_API_KEY` |
+| OpenAI | `OPENAI_API_KEY` |
+| Gemini | `GEMINI_API_KEY` |
-| Flag | Default | Description |
-|------|---------|-------------|
-| `-port` | `8080` | HTTP listen port |
-| `-dev` | `false` | Development mode (serve from filesystem) |
+API keys can also be configured directly in the browser under **Setup > AI Summary**, stored locally in IndexedDB.
-## Deploy
+## LLM Providers
+
+| Provider | Auth | Default Model | Notes |
+|----------|------|---------------|-------|
+| Anthropic | `ANTHROPIC_API_KEY` | claude-sonnet-4-5-20250929 | Cloud API |
+| OpenAI | `OPENAI_API_KEY` | gpt-4o | Cloud API |
+| Gemini | `GEMINI_API_KEY` | gemini-2.0-flash | Cloud API |
+| Ollama | None (local) | — | Set `-llm-endpoint` if not `http://localhost:11434` |
+| None | — | — | Default. AI features disabled. |
+## Deployment
### Standalone Binary
@@ -103,6 +127,7 @@ After=network.target
Type=simple
User=heatguard
ExecStart=/opt/heatguard/heatguard -port 8080
+Environment=ANTHROPIC_API_KEY=sk-...
Restart=on-failure
RestartSec=5
@@ -115,54 +140,33 @@ sudo systemctl daemon-reload
sudo systemctl enable --now heatguard
```
-### Behind a Reverse Proxy (nginx)
-```nginx
-server {
-    listen 443 ssl;
-    server_name heatguard.example.com;
-    ssl_certificate /etc/letsencrypt/live/heatguard.example.com/fullchain.pem;
-    ssl_certificate_key /etc/letsencrypt/live/heatguard.example.com/privkey.pem;
-    location / {
-        proxy_pass http://127.0.0.1:8080;
-        proxy_set_header Host $host;
-        proxy_set_header X-Real-IP $remote_addr;
-        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
-        proxy_set_header X-Forwarded-Proto $scheme;
-    }
-}
-```
### Docker
-```dockerfile
-FROM golang:1.23-alpine AS build
-RUN apk add --no-cache nodejs npm
-WORKDIR /src
-COPY . .
-RUN npm install && make build
-FROM alpine:3.20
-COPY --from=build /src/bin/heatguard /usr/local/bin/
-EXPOSE 8080
-ENTRYPOINT ["heatguard"]
-```
```bash
docker build -t heatguard .
docker run -d -p 8080:8080 heatguard
```
+With an LLM provider:
+```bash
+docker run -d -p 8080:8080 \
+  -e ANTHROPIC_API_KEY=sk-... \
+  heatguard --llm-provider anthropic
+```
+The Dockerfile uses a multi-stage build (`golang:1.25-alpine` builder + `distroless/static` runtime) for a minimal image.
+### Kubernetes / Helm
+A Helm chart is provided in `helm/heatguard/`:
+```bash
+helm install heatguard ./helm/heatguard \
+  --set env.ANTHROPIC_API_KEY=sk-...
+```
+See `helm/heatguard/values.yaml` for all configurable values (replicas, ingress, resources, etc.).
## Usage Workflow
1. **Create a profile** — name + coordinates (auto-detect via browser geolocation)
2. **Add rooms** — area, orientation, windows, insulation, indoor target temp
3. **Add devices & occupants** — heat-producing equipment and people per room
4. **Configure AC units** — capacity, type, room assignments
5. **Fetch forecast** — pulls 3-day hourly weather data
6. **View dashboard** — risk level, temperature timeline, room budgets, care checklist
## Development


@@ -19,10 +19,23 @@ import (
func main() {
	port := flag.Int("port", 8080, "HTTP port")
	dev := flag.Bool("dev", false, "development mode (serve from filesystem)")
+	llmProvider := flag.String("llm-provider", "", "LLM provider (anthropic, openai, gemini, ollama, none)")
+	llmModel := flag.String("llm-model", "", "LLM model name override")
+	llmEndpoint := flag.String("llm-endpoint", "", "LLM API endpoint override (for ollama)")
	flag.Parse()
	cfg := config.Load()
+	if *llmProvider != "" {
+		cfg.LLM.Provider = *llmProvider
+	}
+	if *llmModel != "" {
+		cfg.LLM.Model = *llmModel
+	}
+	if *llmEndpoint != "" {
+		cfg.LLM.Endpoint = *llmEndpoint
+	}
	// Set the embedded filesystem for the server package
	server.WebFS = web.FS


@@ -35,20 +35,6 @@ func ConfigDir() string {
	return filepath.Join(home, ".config", "heatwave")
}
-// DataDir returns the XDG data directory for heatwave.
-func DataDir() string {
-	if xdg := os.Getenv("XDG_DATA_HOME"); xdg != "" {
-		return filepath.Join(xdg, "heatwave")
-	}
-	home, _ := os.UserHomeDir()
-	return filepath.Join(home, ".local", "share", "heatwave")
-}
-// DefaultDBPath returns the default SQLite database path.
-func DefaultDBPath() string {
-	return filepath.Join(DataDir(), "heatwave.db")
-}
// Load reads the config file from the config directory.
func Load() Config {
	cfg := DefaultConfig()


@@ -1,99 +0,0 @@
package config

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"io"
	"os"
	"path/filepath"
)

const keyFileName = "encryption.key"

// keyPath returns the full path to the encryption key file.
func keyPath() string {
	return filepath.Join(ConfigDir(), keyFileName)
}

// loadOrCreateKey reads the 32-byte AES key, creating it if absent.
func loadOrCreateKey() ([]byte, error) {
	path := keyPath()
	data, err := os.ReadFile(path)
	if err == nil && len(data) == 32 {
		return data, nil
	}
	key := make([]byte, 32)
	if _, err := io.ReadFull(rand.Reader, key); err != nil {
		return nil, fmt.Errorf("generate key: %w", err)
	}
	if err := os.MkdirAll(filepath.Dir(path), 0o700); err != nil {
		return nil, fmt.Errorf("create config dir: %w", err)
	}
	if err := os.WriteFile(path, key, 0o600); err != nil {
		return nil, fmt.Errorf("write key file: %w", err)
	}
	return key, nil
}

// Encrypt encrypts plaintext using AES-256-GCM and returns a base64 string.
func Encrypt(plaintext string) (string, error) {
	if plaintext == "" {
		return "", nil
	}
	key, err := loadOrCreateKey()
	if err != nil {
		return "", err
	}
	block, err := aes.NewCipher(key)
	if err != nil {
		return "", fmt.Errorf("new cipher: %w", err)
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return "", fmt.Errorf("new gcm: %w", err)
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return "", fmt.Errorf("generate nonce: %w", err)
	}
	ciphertext := gcm.Seal(nonce, nonce, []byte(plaintext), nil)
	return base64.StdEncoding.EncodeToString(ciphertext), nil
}

// Decrypt decrypts a base64 AES-256-GCM ciphertext back to plaintext.
func Decrypt(encoded string) (string, error) {
	if encoded == "" {
		return "", nil
	}
	key, err := loadOrCreateKey()
	if err != nil {
		return "", err
	}
	ciphertext, err := base64.StdEncoding.DecodeString(encoded)
	if err != nil {
		return "", fmt.Errorf("decode base64: %w", err)
	}
	block, err := aes.NewCipher(key)
	if err != nil {
		return "", fmt.Errorf("new cipher: %w", err)
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return "", fmt.Errorf("new gcm: %w", err)
	}
	nonceSize := gcm.NonceSize()
	if len(ciphertext) < nonceSize {
		return "", fmt.Errorf("ciphertext too short")
	}
	nonce, ct := ciphertext[:nonceSize], ciphertext[nonceSize:]
	plaintext, err := gcm.Open(nil, nonce, ct, nil)
	if err != nil {
		return "", fmt.Errorf("decrypt: %w", err)
	}
	return string(plaintext), nil
}


@@ -106,11 +106,6 @@ func (s *Server) Handler() http.Handler {
	return s.mux
}
-// ListenAndServe starts the server.
-func (s *Server) ListenAndServe(addr string) error {
-	return http.ListenAndServe(addr, s.mux)
-}
type pageData struct {
	Lang string
	Page string


@@ -137,9 +137,6 @@
"apiKeyPlaceholder": "API-Schl\u00fcssel eingeben",
"modelPlaceholder": "Modellname (leer lassen f\u00fcr Standard)",
"save": "Einstellungen speichern",
-"serverProvider": "Server-Anbieter",
-"configured": "Auf Server konfiguriert",
-"notConfigured": "Kein KI-Anbieter auf dem Server konfiguriert.",
"providerOptions": { "anthropic": "Anthropic", "openai": "OpenAI", "gemini": "Google Gemini" }
}
},
@@ -157,7 +154,6 @@
"riskWindows": "Risikozeitr\u00e4ume",
"llmSummary": "KI-Zusammenfassung",
"noData": "Noch keine Daten. Richten Sie Ihr Profil ein und rufen Sie eine Vorhersage ab.",
-"getStarted": "Loslegen",
"goToSetup": "Zur Einrichtung",
"goToGuide": "Anleitung lesen",
"loading": "Laden\u2026",
@@ -169,43 +165,33 @@
"totalGain": "Gesamtgewinn",
"acCapacity": "Klimaleistung",
"headroom": "Reserve",
-"comfortable": "Komfortabel",
-"marginal": "Grenzwertig",
-"overloaded": "\u00dcberlastet",
-"fetchForecastFirst": "Keine Vorhersagedaten. Rufen Sie zuerst eine Vorhersage in der Einrichtung ab.",
"yes": "Ja",
"no": "Nein",
-"riskLow": "Niedrig",
-"riskModerate": "Mittel",
-"riskHigh": "Hoch",
-"riskExtreme": "Extrem",
-"noActions": "Keine Maßnahmen",
+"noActions": "Keine Ma\u00dfnahmen",
"effort": "Aufwand",
"impact": "Wirkung",
-"aiDisclaimer": "KI-generierte Zusammenfassung. Kein Ersatz für professionelle Beratung.",
+"aiDisclaimer": "KI-generierte Zusammenfassung. Kein Ersatz f\u00fcr professionelle Beratung.",
"riskComfort": "Komfortabel",
-"coolComfort": "Keine Kühlung nötig",
-"coolVentilate": "Fenster öffnen",
+"coolComfort": "Keine K\u00fchlung n\u00f6tig",
+"coolVentilate": "Fenster \u00f6ffnen",
"coolAC": "Klimaanlage",
-"coolOverloaded": "Klima überlastet",
+"coolOverloaded": "Klima \u00fcberlastet",
"coolSealed": "Geschlossen halten",
-"aiActions": "KI-empfohlene Maßnahmen",
+"aiActions": "KI-empfohlene Ma\u00dfnahmen",
"legendTemp": "Temperatur",
-"legendCooling": "Kühlung",
+"legendCooling": "K\u00fchlung",
"refreshForecast": "Vorhersage aktualisieren",
-"refreshing": "Aktualisierung\u2026",
-"forecastRefreshed": "Vorhersage aktualisiert",
"quickSettings": "Schnelleinstellungen",
"qsIndoorTemp": "Raumtemperatur (\u00b0C)",
"qsIndoorHumidity": "Luftfeuchtigkeit (%)",
"qsApply": "Anwenden",
-"legendAI": "KI-Maßnahmen",
+"legendAI": "KI-Ma\u00dfnahmen",
"category": {
"shading": "Verschattung",
-"ventilation": "Lüftung",
-"internal_gains": "Wärmequellen",
+"ventilation": "L\u00fcftung",
+"internal_gains": "W\u00e4rmequellen",
"ac_strategy": "Klimastrategie",
-"hydration": "Flüssigkeit",
+"hydration": "Fl\u00fcssigkeit",
"care": "Pflege"
}
},
@@ -271,13 +257,10 @@
"cancel": "Abbrechen",
"delete": "L\u00f6schen",
"edit": "Bearbeiten",
"saving": "Speichern\u2026",
"saved": "Gespeichert",
"error": "Etwas ist schiefgelaufen.",
"confirm": "Sind Sie sicher?",
"loading": "Laden\u2026",
"noProfile": "Kein Profil ausgew\u00e4hlt.",
-"watts": "W",
-"btuh": "BTU/h"
+"watts": "W"
}
}


@@ -137,9 +137,6 @@
"apiKeyPlaceholder": "Enter API key",
"modelPlaceholder": "Model name (leave blank for default)",
"save": "Save Settings",
-"serverProvider": "Server provider",
-"configured": "Configured on server",
-"notConfigured": "No AI provider configured on the server.",
"providerOptions": { "anthropic": "Anthropic", "openai": "OpenAI", "gemini": "Google Gemini" }
}
},
@@ -157,7 +154,6 @@
"riskWindows": "Risk Windows",
"llmSummary": "AI Summary",
"noData": "No data yet. Set up your profile and fetch a forecast.",
-"getStarted": "Get Started",
"goToSetup": "Go to Setup",
"goToGuide": "Read the Guide",
"loading": "Loading\u2026",
@@ -169,16 +165,8 @@
"totalGain": "Total Gain",
"acCapacity": "AC Capacity",
"headroom": "Headroom",
-"comfortable": "Comfortable",
-"marginal": "Marginal",
-"overloaded": "Overloaded",
-"fetchForecastFirst": "No forecast data. Fetch a forecast in Setup first.",
"yes": "Yes",
"no": "No",
-"riskLow": "Low",
-"riskModerate": "Moderate",
-"riskHigh": "High",
-"riskExtreme": "Extreme",
"noActions": "No actions",
"effort": "Effort",
"impact": "Impact",
@@ -193,8 +181,6 @@
"legendTemp": "Temperature",
"legendCooling": "Cooling",
"refreshForecast": "Refresh Forecast",
-"refreshing": "Refreshing\u2026",
-"forecastRefreshed": "Forecast refreshed",
"quickSettings": "Quick Settings",
"qsIndoorTemp": "Indoor Temp (\u00b0C)",
"qsIndoorHumidity": "Indoor Humidity (%)",
@@ -271,13 +257,10 @@
"cancel": "Cancel",
"delete": "Delete",
"edit": "Edit",
"saving": "Saving\u2026",
"saved": "Saved",
"error": "Something went wrong.",
"confirm": "Are you sure?",
"loading": "Loading\u2026",
"noProfile": "No profile selected.",
-"watts": "W",
-"btuh": "BTU/h"
+"watts": "W"
}
}