Add Pi-hole with AdGuard DOH/DOT integration, reorganize swarm stacks, add DNS/n8n docs
services/n8n/workflows/README.md | 237 lines (new file)
@@ -0,0 +1,237 @@
# Homelab n8n Monitoring Workflows

This directory contains intelligent n8n workflows for monitoring and integrating your homelab infrastructure using AI-powered analysis.

## 📋 Workflows

### 1. **Homelab Health Monitor** (`homelab-health-monitor.json`)

**Purpose:** Comprehensive health monitoring of all homelab services

**Schedule:** Every 15 minutes (or manually via webhook)

**Features:**
- Network connectivity checks (internet + internal DNS)
- Docker Swarm service status monitoring
- Service endpoint validation (Komodo, OpenWebUI, Paperless, Prometheus, LM Studio)
- AI-powered health analysis using LM Studio
- Health scoring (0-100) and automated alerting

**Webhook:** `POST https://n8n.sj98.duckdns.org/webhook/health-check`

### 2. **Homelab Log Analyzer** (`homelab-log-analyzer.json`)

**Purpose:** Automated AI analysis of Docker service logs

**Schedule:** Every 6 hours

**Features:**
- Collects logs from critical services (Traefik, n8n, OpenWebUI, Komodo, Prometheus)
- Parses ERROR, WARN, and CRITICAL patterns
- AI analysis of log patterns and issues
- Generates actionable recommendations
- Alerts on high error counts

> **💡 For manual log viewing:** use **Dozzle** at your configured URL for real-time, interactive log viewing with a clean web interface. This workflow is for automated AI-powered analysis and alerting.

### 3. **Homelab Integration Advisor** (`homelab-integration-advisor.json`)

**Purpose:** AI-powered service integration recommendations

**Schedule:** Daily at 9 AM (or manually via webhook)

**Features:**
- Discovers all running services and their capabilities
- Identifies integration opportunities
- AI generates specific n8n workflow patterns
- Prioritizes by complexity and value
- Provides step-by-step implementation guidance

**Webhook:** `POST https://n8n.sj98.duckdns.org/webhook/integration-advisor`

## 🚀 Installation

### 1. Import Workflows

**Option A: Via the n8n UI**

1. Open n8n at https://n8n.sj98.duckdns.org
2. Click "Workflows" → "Import from File"
3. Select each JSON file from this directory

**Option B: Via the API (if the public API is enabled)**

```bash
cd /workspace/homelab/services/n8n/workflows
curl -X POST https://n8n.sj98.duckdns.org/api/v1/workflows \
  -H "Content-Type: application/json" \
  -H "X-N8N-API-KEY: your-api-key" \
  -d @homelab-health-monitor.json
```
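If the public API is enabled, the single `curl` call above extends naturally to a loop that imports every workflow in this directory in one pass. A hedged sketch: it assumes the same endpoint, and that your API key is exported as `N8N_API_KEY` (a variable name chosen here, not configured anywhere in this repo):

```bash
# Bulk import: POST every workflow JSON in this directory.
# Assumes the n8n public API is enabled and N8N_API_KEY is exported.
cd /workspace/homelab/services/n8n/workflows
for wf in ./*.json; do
  curl -sf -X POST https://n8n.sj98.duckdns.org/api/v1/workflows \
    -H "Content-Type: application/json" \
    -H "X-N8N-API-KEY: ${N8N_API_KEY}" \
    -d @"$wf" \
    && echo "imported: $wf" \
    || echo "failed:   $wf" >&2
done
```

Each workflow still arrives deactivated; the activation step below applies either way.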
### 2. Configure AI Model

Edit each workflow and set your preferred LM Studio model:

- **Health Monitor:** uses `deepseek-r1-distill-llama-8b` (reasoning)
- **Log Analyzer:** uses `qwen2.5-coder-7b-instruct` (technical analysis)
- **Integration Advisor:** uses `deepseek-r1-distill-llama-8b` (planning)

Available models on your LM Studio instance (.81:1234):

- `deepseek-r1-distill-llama-8b`
- `qwen2.5-coder-7b-instruct`
- `qwen/qwen3-coder-30b`
- `mistralai/codestral-22b-v0.1`
- `google/gemma-3-12b`

### 3. Activate Workflows

1. Open each workflow
2. Toggle the "Active" switch in the top right
3. Verify the schedule trigger is enabled

## 🔧 Configuration

### LM Studio Connection

The workflows connect to LM Studio via the `lm-studio` hostname (mapped to 192.168.1.81:1234 via `extra_hosts` in `n8n-stack.yml`).

**Test connection:**

```bash
docker exec <n8n-container-id> curl http://lm-studio:1234/v1/models
```
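Listing models only proves the server is reachable; to confirm a model actually answers, you can send a minimal chat request over the same OpenAI-compatible API. A sketch, assuming the `lm-studio` hostname mapping and one of the model names listed above:

```bash
# Minimal chat-completion smoke test against LM Studio from inside n8n.
docker exec $(docker ps -q -f name=n8n) curl -s \
  http://lm-studio:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-r1-distill-llama-8b",
    "messages": [{"role": "user", "content": "Reply with the single word OK."}],
    "max_tokens": 5
  }'
```

A JSON response with a `choices` array means the full request path (n8n container → `extra_hosts` mapping → LM Studio) works end to end.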
### Notifications (Optional)

To enable alerts, add these nodes to each workflow:

- **Email:** use n8n's Email node with SMTP credentials
- **Discord:** use a Webhook node with a Discord webhook URL
- **Slack:** use the Slack node with OAuth credentials
- **Home Assistant:** send to `http://homeassistant.local:8123/api/webhook/n8n-alert`
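For Discord, the Webhook node boils down to a single JSON POST. A hedged sketch of the equivalent request, where `DISCORD_WEBHOOK_URL` is an assumed environment variable (nothing in this repo defines it; create the webhook in your Discord server settings first):

```bash
# Post a health alert to a Discord webhook.
# DISCORD_WEBHOOK_URL is an assumption, not part of this repo's config.
score=42
msg="🚨 Homelab health score: ${score}/100"
curl -sf -X POST "$DISCORD_WEBHOOK_URL" \
  -H "Content-Type: application/json" \
  -d "{\"content\": \"${msg}\"}"
```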
## 📊 Recommended Integration Patterns

Based on your homelab services, here are high-value integrations to implement:

### 1. **AI-Powered Document Processing**

**Services:** n8n → Paperless → OpenWebUI

**Pattern:** Auto-tag and summarize uploaded documents using AI

```
Trigger: Paperless webhook (new document)
→ Get document content
→ Send to LM Studio for tagging/summary
→ Update Paperless tags and notes
```

### 2. **Metric-Based Automation**

**Services:** Prometheus → n8n → Docker/Komodo

**Pattern:** Auto-restart services on high resource usage

```
Trigger: Prometheus Alertmanager webhook
→ Parse alert (high CPU/memory)
→ Execute docker service update --force <service>
→ Send notification
```

### 3. **Smart Search Integration**

**Services:** SearXNG → OpenWebUI

**Pattern:** Enhanced AI chat with web search capability

```
Trigger: OpenWebUI webhook or manual
→ Query SearXNG for context
→ Send results + query to LM Studio
→ Return AI response with citations
```

### 4. **Backup Automation**

**Services:** n8n → All Services → Storage

**Pattern:** Automated backup verification and reporting

```
Schedule: Daily at 2 AM
→ Trigger OMV backup scripts
→ Verify backup completion
→ Calculate backup sizes
→ AI analysis of backup health
→ Send report
```
### 5. **Development Pipeline**

**Services:** Gitea → Komodo → n8n

**Pattern:** GitOps deployment automation

```
Trigger: Gitea webhook (push to main)
→ Parse commit info
→ Trigger Komodo deployment
→ Monitor deployment status
→ Run health checks
→ Send notification
```

## 🐛 Troubleshooting

### Connection to LM Studio Fails

```bash
# Check if extra_hosts is configured
docker service inspect n8n_n8n | grep -A 5 ExtraHosts

# Test from the n8n container
docker exec $(docker ps -q -f name=n8n) curl http://lm-studio:1234/v1/models

# Verify LM Studio is running on .81
curl http://192.168.1.81:1234/v1/models
```

### Docker Commands Fail

```bash
# Verify the Docker socket is mounted
docker service inspect n8n_n8n | grep -A 2 docker.sock

# Test from the n8n container
docker exec $(docker ps -q -f name=n8n) docker ps
```

### Workflows Don't Execute

- Check n8n logs: `docker service logs n8n_n8n --tail 100`
- Verify the workflow is activated (toggle in the UI)
- Check the schedule trigger settings
- Ensure n8n has sufficient resources (increase memory/CPU limits)

## 📜 Log Viewing

### Interactive Log Viewing with Dozzle

For **manual, real-time log viewing**, use **Dozzle**, which is already part of your homelab:

**Access:** check your Traefik/Portainer configuration for the Dozzle URL

**Features:**
- Real-time log streaming with color coding
- Multi-container view
- Search and filter logs
- No configuration needed: automatically discovers containers
- Clean, responsive web UI

**Use Dozzle when you need to:**
- Investigate specific issues in real time
- Follow logs during deployments
- Debug container startup problems
- Search for specific error messages

### Automated Log Analysis (This Workflow)

The **Homelab Log Analyzer** workflow complements Dozzle by:
- Running periodically (every 6 hours) to catch issues you might miss
- Using AI to identify patterns across multiple services
- Sending proactive alerts before issues escalate
- Providing trend analysis over time

**Both tools serve different purposes and work well together.**

---

## 📈 Next Steps

1. **Import and test** each workflow manually
2. **Configure notifications** (email/Discord/Slack)
3. **Review AI recommendations** from the Integration Advisor
4. **Implement priority integrations** suggested by the AI
5. **Monitor health scores** and adjust thresholds
6. **Create custom workflows** based on your specific needs

## 🔗 Useful Links

- **n8n Documentation:** https://docs.n8n.io
- **LM Studio API:** http://lm-studio:1234 (OpenAI-compatible)
- **Prometheus API:** http://prometheus.sj98.duckdns.org/api/v1
- **Dozzle Logs:** your Dozzle URL (real-time log viewer)
- **Docker API:** Unix socket at `/var/run/docker.sock`

## 💡 Tips

- **Use Dozzle for interactive debugging**, workflows for automated monitoring
- Start with manual triggers before enabling schedules
- Use an AI model with an appropriate context window for your data
- Monitor n8n resource usage and increase limits if needed
- Keep workflows modular for easier debugging and maintenance
- Save successful execution results for reference
services/n8n/workflows/homelab-health-monitor.json | 778 lines (new file)
@@ -0,0 +1,778 @@
{
  "name": "Homelab Health Monitor",
  "nodes": [
    {
      "parameters": {
        "rule": { "interval": [{ "field": "minutes", "minutesInterval": 15 }] }
      },
      "id": "schedule-trigger",
      "name": "Every 15 Minutes",
      "type": "n8n-nodes-base.scheduleTrigger",
      "typeVersion": 1.2,
      "position": [250, 300]
    },
    {
      "parameters": {
        "httpMethod": "POST",
        "path": "health-check",
        "responseMode": "responseNode",
        "options": {}
      },
      "id": "webhook-trigger",
      "name": "Manual Trigger Webhook",
      "type": "n8n-nodes-base.webhook",
      "typeVersion": 2,
      "position": [250, 500],
      "webhookId": "homelab-health"
    },
    {
      "parameters": { "url": "=https://www.google.com", "options": { "timeout": 5000 } },
      "id": "check-internet-dns",
      "name": "Check Google DNS",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [500, 200],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=https://1.1.1.1", "options": { "timeout": 5000 } },
      "id": "check-cloudflare",
      "name": "Check Cloudflare DNS",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [500, 350],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=http://192.168.1.196:80", "options": { "timeout": 3000 } },
      "id": "check-internal-dns-1",
      "name": "Check Pi-hole .196",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [500, 500],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=http://192.168.1.245:80", "options": { "timeout": 3000 } },
      "id": "check-internal-dns-2",
      "name": "Check Pi-hole .245",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [500, 650],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=http://192.168.1.62:80", "options": { "timeout": 3000 } },
      "id": "check-internal-dns-3",
      "name": "Check Pi-hole .62",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [500, 800],
      "continueOnFail": true
    },
    {
      "parameters": { "command": "docker service ls --format '{{json .}}'" },
      "id": "docker-service-list",
      "name": "Get Docker Services",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [750, 300]
    },
    {
      "parameters": { "command": "docker node ls --format '{{json .}}'" },
      "id": "docker-node-list",
      "name": "Get Swarm Nodes",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [750, 450]
    },
    {
      "parameters": { "url": "=https://komodo.sj98.duckdns.org", "options": { "timeout": 5000 } },
      "id": "check-komodo",
      "name": "Check Komodo",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [1000, 200],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=https://ai.sj98.duckdns.org/health", "options": { "timeout": 5000 } },
      "id": "check-openwebui",
      "name": "Check OpenWebUI",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [1000, 350],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=https://paperless.sj98.duckdns.org/api", "options": { "timeout": 5000 } },
      "id": "check-paperless",
      "name": "Check Paperless",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [1000, 500],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=https://prometheus.sj98.duckdns.org/-/healthy", "options": { "timeout": 5000 } },
      "id": "check-prometheus",
      "name": "Check Prometheus",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [1000, 650],
      "continueOnFail": true
    },
    {
      "parameters": {
        "command": "curl -sf http://192.168.1.1 > /dev/null && echo '{\"node\": \"Gateway\", \"status\": \"healthy\"}' || echo '{\"node\": \"Gateway\", \"error\": \"unreachable\"}'"
      },
      "id": "check-gateway",
      "name": "Check Gateway",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [500, 950]
    },
    {
      "parameters": {
        "command": "metrics=$(curl -s --connect-timeout 2 http://192.168.1.57:9100/metrics | grep -E \"node_load1 |node_memory_MemAvailable_bytes |node_memory_MemTotal_bytes \" | tr '\\n' ',' || echo \"failed\"); echo \"{\\\"node\\\": \\\"Proxmox Host\\\", \\\"metrics\\\": \\\"$metrics\\\"}\""
      },
      "id": "check-proxmox",
      "name": "Check Proxmox",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [1000, 950]
    },
    {
      "parameters": { "url": "=http://lm-studio:1234/v1/models", "options": { "timeout": 5000 } },
      "id": "check-lm-studio",
      "name": "Check LM Studio",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [1000, 800],
      "continueOnFail": true
    },
    {
      "parameters": {
"jsCode": "const items = $input.all();\n\nconst healthData = {\n timestamp: new Date().toISOString(),\n network: {\n internet: [],\n internal: [],\n gateway: {}\n },\n docker: {\n services: [],\n nodes: []\n },\n infrastructure: {\n proxmox: {}\n },\n services: []\n};\n\n// Process all health check results\nfor (const item of items) {\n let nodeName = item.json.node || 'unknown';\n const success = !item.json.error;\n \n // Handle Execute Command JSON output (Gateway/Proxmox)\n if (item.json.stdout && item.json.stdout.trim().startsWith('{')) {\n try {\n const parsed = JSON.parse(item.json.stdout);\n if (parsed.node) nodeName = parsed.node;\n if (parsed.metrics) item.json.metrics = parsed.metrics;\n if (parsed.status) item.json.status = parsed.status;\n } catch (e) {}\n }\n\n if (nodeName.includes('DNS') || nodeName.includes('Cloudflare')) {\n healthData.network.internet.push({\n name: nodeName,\n status: success ? 'healthy' : 'unhealthy',\n error: item.json.error || null\n });\n } else if (nodeName.includes('Gateway')) {\n healthData.network.gateway = {\n status: item.json.status || 'unhealthy',\n error: item.json.error || null\n };\n } else if (nodeName.includes('Pi-hole')) {\n healthData.network.internal.push({\n name: nodeName,\n status: success ? 'healthy' : 'unhealthy',\n error: item.json.error || null\n });\n } else if (nodeName.includes('Proxmox')) {\n healthData.infrastructure.proxmox = {\n status: item.json.metrics ? 'healthy' : 'unhealthy',\n metrics: item.json.metrics || null,\n error: item.json.error || null\n };\n } else if (nodeName.includes('Docker')) {\n try {\n const data = JSON.parse(item.json.stdout || '[]');\n if (nodeName.includes('Services')) {\n healthData.docker.services = data;\n } else if (nodeName.includes('Nodes')) {\n healthData.docker.nodes = data;\n }\n } catch (e) {\n healthData.docker.error = e.message;\n }\n } else {\n healthData.services.push({\n name: nodeName,\n status: success ? 
'healthy' : 'unhealthy',\n statusCode: item.json.statusCode,\n error: item.json.error || null\n });\n }\n}\n\n// Calculate overall health score (0-100)\nlet totalChecks = 0;\nlet passedChecks = 0;\n\nhealthData.network.internet.forEach(check => {\n totalChecks++;\n if (check.status === 'healthy') passedChecks++;\n});\n\nif (healthData.network.gateway.status === 'healthy') {\n totalChecks++;\n passedChecks++;\n} else if (healthData.network.gateway.status) {\n totalChecks++;\n}\n\nhealthData.network.internal.forEach(check => {\n totalChecks++;\n if (check.status === 'healthy') passedChecks++;\n});\n\nif (healthData.infrastructure.proxmox.status === 'healthy') {\n totalChecks++;\n passedChecks++;\n} else if (healthData.infrastructure.proxmox.status) {\n totalChecks++;\n}\n\nhealthData.services.forEach(service => {\n totalChecks++;\n if (service.status === 'healthy') passedChecks++;\n});\n\nhealthData.healthScore = totalChecks > 0 ? Math.round((passedChecks / totalChecks) * 100) : 0;\nhealthData.summary = `${passedChecks}/${totalChecks} checks passed`;\n\nreturn [{ json: healthData }];"
      },
      "id": "aggregate-health",
      "name": "Aggregate Health Data",
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [1250, 500]
    },
    {
      "parameters": {
        "method": "POST",
        "url": "=http://lm-studio:1234/v1/chat/completions",
        "sendBody": true,
        "bodyParameters": {
          "parameters": [
            { "name": "model", "value": "=deepseek-r1-distill-llama-8b" },
            {
              "name": "messages",
"value": "={{ [{\"role\":\"system\",\"content\":\"You are a homelab infrastructure analyst. Analyze health check data and provide concise insights about system status, potential issues, and recommendations. Respond in JSON format with fields: overall_status, critical_issues (array), warnings (array), recommendations (array).\"}, {\"role\":\"user\",\"content\":\"Analyze this homelab health data:\\n\\n\" + JSON.stringify($json, null, 2)}] }}"
            },
            { "name": "temperature", "value": "=0.3" },
            { "name": "max_tokens", "value": "=1000" }
          ]
        },
        "options": { "timeout": 30000 }
      },
      "id": "ai-analysis",
      "name": "AI Health Analysis",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [1500, 500]
    },
    {
      "parameters": {
"jsCode": "const healthData = $('Aggregate Health Data').item.json;\nconst aiResponse = $json.choices[0].message.content;\n\nlet analysis;\ntry {\n // Try to parse AI response as JSON\n analysis = JSON.parse(aiResponse);\n} catch (e) {\n // If not JSON, structure it\n analysis = {\n overall_status: aiResponse.includes('healthy') ? 'healthy' : 'needs attention',\n raw_response: aiResponse\n };\n}\n\nconst report = {\n generated_at: new Date().toISOString(),\n health_score: healthData.healthScore,\n summary: healthData.summary,\n network_status: {\n internet: healthData.network.internet,\n internal_dns: healthData.network.internal\n },\n docker_swarm: {\n nodes: healthData.docker.nodes.length || 0,\n services: healthData.docker.services.length || 0,\n services_list: healthData.docker.services\n },\n service_endpoints: healthData.services,\n ai_analysis: analysis,\n alert_level: healthData.healthScore < 70 ? 'critical' : healthData.healthScore < 90 ? 'warning' : 'normal'\n};\n\nreturn [{ json: report }];"
      },
      "id": "build-report",
      "name": "Build Final Report",
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [1750, 500]
    },
    {
      "parameters": {
        "conditions": {
          "options": {
            "leftValue": "",
            "caseSensitive": true,
            "typeValidation": "strict"
          },
          "combinator": "or",
          "conditions": [
            {
              "id": "alert-critical",
              "leftValue": "={{ $json.alert_level }}",
              "rightValue": "critical",
              "operator": { "type": "string", "operation": "equals" }
            },
            {
              "id": "alert-warning",
              "leftValue": "={{ $json.health_score }}",
              "rightValue": 80,
              "operator": { "type": "number", "operation": "lt" }
            }
          ]
        }
      },
      "id": "should-alert",
      "name": "Should Alert?",
      "type": "n8n-nodes-base.if",
      "typeVersion": 2,
      "position": [2000, 500]
    },
    {
      "parameters": {
"content": "🚨 **Homelab Health Alert**\\n\\n**Health Score:** {{ $json.health_score }}/100\\n**Status:** {{ $json.alert_level }}\\n**Time:** {{ $json.generated_at }}\\n\\n**Summary:** {{ $json.summary }}\\n\\n**AI Analysis:**\\n{{ $json.ai_analysis.overall_status }}\\n\\n{% if $json.ai_analysis.critical_issues %}**Critical Issues:**\\n{% for issue in $json.ai_analysis.critical_issues %}- {{ issue }}\\n{% endfor %}{% endif %}\\n\\n{% if $json.ai_analysis.recommendations %}**Recommendations:**\\n{% for rec in $json.ai_analysis.recommendations %}- {{ rec }}\\n{% endfor %}{% endif %}",
        "options": {}
      },
      "id": "format-alert",
      "name": "Format Alert Message",
      "type": "n8n-nodes-base.markdown",
      "typeVersion": 1,
      "position": [2250, 400]
    },
    {
      "parameters": {
        "respondWith": "json",
        "responseBody": "={{ $json }}"
      },
      "id": "webhook-response",
      "name": "Webhook Response",
      "type": "n8n-nodes-base.respondToWebhook",
      "typeVersion": 1,
      "position": [2250, 600]
    }
  ],
  "pinData": {},
"connections": {
|
||||
"Every 15 Minutes": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Check Google DNS",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Cloudflare DNS",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Pi-hole .196",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Pi-hole .245",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Pi-hole .62",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Get Docker Services",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Get Swarm Nodes",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Komodo",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check OpenWebUI",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Paperless",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Prometheus",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check LM Studio",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Gateway",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Proxmox",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Manual Trigger Webhook": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Check Google DNS",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Cloudflare DNS",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Pi-hole .196",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Pi-hole .245",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Pi-hole .62",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Get Docker Services",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Get Swarm Nodes",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Komodo",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check OpenWebUI",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Paperless",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Prometheus",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check LM Studio",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Gateway",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
},
|
||||
{
|
||||
"node": "Check Proxmox",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Google DNS": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Cloudflare DNS": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Pi-hole .196": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Pi-hole .245": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Pi-hole .62": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Get Docker Services": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Get Swarm Nodes": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Komodo": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check OpenWebUI": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Paperless": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Prometheus": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Gateway": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check Proxmox": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Check LM Studio": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Aggregate Health Data",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Aggregate Health Data": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "AI Health Analysis",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"AI Health Analysis": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Build Final Report",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Build Final Report": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Should Alert?",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Should Alert?": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Format Alert Message",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
],
|
||||
[
|
||||
{
|
||||
"node": "Webhook Response",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
},
|
||||
"Format Alert Message": {
|
||||
"main": [
|
||||
[
|
||||
{
|
||||
"node": "Webhook Response",
|
||||
"type": "main",
|
||||
"index": 0
|
||||
}
|
||||
]
|
||||
]
|
||||
}
|
||||
},
|
||||
"active": false,
|
||||
"settings": {
|
||||
"executionOrder": "v1"
|
||||
},
|
||||
"versionId": "1",
|
||||
"meta": {
|
||||
"templateCredsSetupCompleted": true,
|
||||
"instanceId": "homelab"
|
||||
},
|
||||
"id": "homelab-health-monitor",
|
||||
"tags": []
|
||||
}
|
||||
services/n8n/workflows/homelab-integration-advisor.json | 288 lines (new file)
@@ -0,0 +1,288 @@
{
  "name": "Homelab Integration Advisor",
  "nodes": [
    {
      "parameters": {
        "rule": { "interval": [{ "field": "days", "daysInterval": 1, "triggerAtHour": 9 }] }
      },
      "id": "daily-trigger",
      "name": "Daily at 9 AM",
      "type": "n8n-nodes-base.scheduleTrigger",
      "typeVersion": 1.2,
      "position": [250, 400]
    },
    {
      "parameters": {
        "httpMethod": "POST",
        "path": "integration-advisor",
        "responseMode": "responseNode",
        "options": {}
      },
      "id": "webhook-trigger",
      "name": "Manual Trigger",
      "type": "n8n-nodes-base.webhook",
      "typeVersion": 2,
      "position": [250, 600],
      "webhookId": "integration-advisor"
    },
    {
      "parameters": {
        "command": "docker service ls --format '{{.Name}}|{{.Mode}}|{{.Replicas}}|{{.Image}}|{{.Ports}}'"
      },
      "id": "get-services",
      "name": "Get All Services",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [500, 400]
    },
    {
      "parameters": { "url": "=http://prometheus:9090/api/v1/query?query=up", "options": { "timeout": 5000 } },
      "id": "get-prometheus-metrics",
      "name": "Get Prometheus Metrics",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [500, 550],
      "continueOnFail": true
    },
    {
      "parameters": { "url": "=http://lm-studio:1234/v1/models", "options": {} },
      "id": "get-ai-models",
      "name": "Get Available AI Models",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [500, 700],
      "continueOnFail": true
    },
    {
      "parameters": {
"jsCode": "const items = $input.all();\n\nconst inventory = {\n timestamp: new Date().toISOString(),\n services: [],\n capabilities: {\n ai: [],\n monitoring: [],\n automation: [],\n storage: [],\n productivity: [],\n media: [],\n development: []\n },\n integration_potential: []\n};\n\n// Parse service list\nconst serviceData = items.find(i => i.json.stdout);\nif (serviceData && serviceData.json.stdout) {\n const lines = serviceData.json.stdout.split('\\n').filter(l => l.trim());\n lines.forEach(line => {\n const [name, mode, replicas, image, ports] = line.split('|');\n const service = { name, mode, replicas, image, ports };\n inventory.services.push(service);\n \n // Categorize by capability\n if (name.includes('openwebui') || name.includes('lm-studio') || name.includes('ollama')) {\n inventory.capabilities.ai.push(name);\n } else if (name.includes('prometheus') || name.includes('grafana') || name.includes('alert')) {\n inventory.capabilities.monitoring.push(name);\n } else if (name.includes('n8n') || name.includes('komodo')) {\n inventory.capabilities.automation.push(name);\n } else if (name.includes('paperless') || name.includes('stirling') || name.includes('nextcloud')) {\n inventory.capabilities.productivity.push(name);\n } else if (name.includes('plex') || name.includes('jellyfin') || name.includes('immich')) {\n inventory.capabilities.media.push(name);\n } else if (name.includes('gitea') || name.includes('code-server')) {\n inventory.capabilities.development.push(name);\n } else if (name.includes('omv') || name.includes('samba')) {\n inventory.capabilities.storage.push(name);\n }\n });\n}\n\n// Get AI models\nconst aiModels = items.find(i => i.json.data);\nif (aiModels && aiModels.json.data) {\n inventory.ai_models = aiModels.json.data.map(m => m.id);\n}\n\n// Define integration opportunities\nconst integrations = [\n { from: 'n8n', to: 'paperless', type: 'Document automation', potential: 'high' },\n { from: 'n8n', to: 'prometheus', type: 'Metric-based 
triggers', potential: 'high' },\n { from: 'n8n', to: 'openwebui', type: 'AI-powered workflows', potential: 'high' },\n { from: 'openwebui', to: 'searxng', type: 'Enhanced search', potential: 'medium' },\n { from: 'prometheus', to: 'grafana', type: 'Visualization', potential: 'existing' },\n { from: 'gitea', to: 'komodo', type: 'CI/CD automation', potential: 'high' },\n { from: 'paperless', to: 'nextcloud', type: 'Document storage', potential: 'medium' },\n { from: 'immich', to: 'openwebui', type: 'Photo analysis', potential: 'medium' },\n { from: 'home-assistant', to: 'all', type: 'Smart home integration', potential: 'high' }\n];\n\ninventory.integration_potential = integrations.filter(i => {\n const fromExists = inventory.services.some(s => s.name.includes(i.from.split('-')[0]));\n const toExists = i.to === 'all' || inventory.services.some(s => s.name.includes(i.to.split('-')[0]));\n return fromExists && toExists;\n});\n\nreturn [{ json: inventory }];"
|
||||
},
|
||||
"id": "build-inventory",
|
||||
"name": "Build Service Inventory",
|
||||
"type": "n8n-nodes-base.code",
|
||||
"typeVersion": 2,
|
||||
"position": [
|
||||
750,
|
||||
500
|
||||
]
|
||||
},
|
||||
{
|
||||
"parameters": {
|
||||
"method": "POST",
|
||||
"url": "=http://lm-studio:1234/v1/chat/completions",
|
||||
"sendBody": true,
|
||||
"bodyParameters": {
|
||||
"parameters": [
|
||||
{
|
||||
"name": "model",
|
||||
"value": "=deepseek-r1-distill-llama-8b"
|
||||
},
|
||||
{
|
||||
"name": "messages",
|
||||
"value": "={{ [{\"role\":\"system\",\"content\":\"You are a homelab integration expert specializing in service orchestration with n8n, Docker, and modern DevOps tools. Analyze the provided service inventory and recommend specific integration workflows. For each recommendation provide: 1) Services involved 2) Integration type 3) Specific n8n workflow pattern 4) Expected benefits 5) Complexity (low/medium/high). Respond in JSON format with an array of recommendations.\"}, {\"role\":\"user\",\"content\":\"Analyze this homelab and recommend integration workflows:\\n\\nServices: \" + JSON.stringify($json.capabilities, null, 2) + \"\\n\\nAvailable AI Models: \" + JSON.stringify($json.ai_models || [], null, 2) + \"\\n\\nPotential Integrations Identified: \" + JSON.stringify($json.integration_potential, null, 2)}] }}"
|
||||
},
|
||||
{
|
||||
"name": "temperature",
|
||||
"value": "=0.4"
|
||||
},
|
||||
{
|
||||
"name": "max_tokens",
|
||||
"value": "=2000"
|
||||
}
|
||||
]
|
||||
},
|
||||
"options": {
|
||||
"timeout": 40000
|
||||
}
|
||||
},
|
||||
"id": "ai-integration-advisor",
|
||||
"name": "AI Integration Advisor",
|
||||
"type": "n8n-nodes-base.httpRequest",
|
||||
"typeVersion": 4.2,
|
||||
"position": [
|
||||
1000,
|
||||
500
|
||||
]
|
||||
},
|
||||
{
|
||||
"parameters": {
|
||||
"jsCode": "const inventory = $('Build Service Inventory').item.json;\nconst aiResponse = $json.choices[0].message.content;\n\nlet recommendations;\ntry {\n const jsonMatch = aiResponse.match(/\\{[\\s\\S]*\\}|\\[[\\s\\S]*\\]/);\n recommendations = jsonMatch ? JSON.parse(jsonMatch[0]) : { raw: aiResponse };\n} catch (e) {\n recommendations = { raw: aiResponse, error: e.message };\n}\n\nconst report = {\n generated_at: new Date().toISOString(),\n homelab_summary: {\n total_services: inventory.services.length,\n capabilities: inventory.capabilities,\n ai_models_available: inventory.ai_models?.length || 0\n },\n integration_opportunities: inventory.integration_potential,\n ai_recommendations: recommendations,\n priority_integrations: [],\n quick_wins: []\n};\n\n// Extract priority integrations from AI response\nif (Array.isArray(recommendations)) {\n report.priority_integrations = recommendations\n .filter(r => r.complexity === 'low' || r.complexity === 'medium')\n .slice(0, 5);\n report.quick_wins = recommendations\n .filter(r => r.complexity === 'low')\n .slice(0, 3);\n} else if (recommendations.recommendations) {\n report.priority_integrations = recommendations.recommendations.slice(0, 5);\n}\n\nreturn [{ json: report }];"
|
||||
},
|
||||
"id": "build-integration-report",
|
||||
"name": "Build Integration Report",
|
||||
"type": "n8n-nodes-base.code",
|
||||
"typeVersion": 2,
|
||||
"position": [
|
||||
1250,
|
||||
500
|
||||
]
|
||||
},
|
||||
{
|
||||
"parameters": {
|
||||
"respondWith": "json",
|
||||
"responseBody": "={{ $json }}"
|
||||
},
|
||||
"id": "webhook-response",
|
||||
"name": "Webhook Response",
|
||||
"type": "n8n-nodes-base.respondToWebhook",
|
||||
"typeVersion": 1,
|
||||
"position": [
|
||||
1500,
|
||||
500
|
||||
]
|
||||
}
|
||||
],
|
||||
"pinData": {},
|
||||
  "connections": {
    "Daily at 9 AM": {
      "main": [
        [
          {
            "node": "Get All Services",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get Prometheus Metrics",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get Available AI Models",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Manual Trigger": {
      "main": [
        [
          {
            "node": "Get All Services",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get Prometheus Metrics",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get Available AI Models",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get All Services": {
      "main": [
        [
          {
            "node": "Build Service Inventory",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get Prometheus Metrics": {
      "main": [
        [
          {
            "node": "Build Service Inventory",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get Available AI Models": {
      "main": [
        [
          {
            "node": "Build Service Inventory",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Build Service Inventory": {
      "main": [
        [
          {
            "node": "AI Integration Advisor",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "AI Integration Advisor": {
      "main": [
        [
          {
            "node": "Build Integration Report",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Build Integration Report": {
      "main": [
        [
          {
            "node": "Webhook Response",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "1",
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "homelab"
  },
  "id": "homelab-integration-advisor",
  "tags": []
}
332
services/n8n/workflows/homelab-log-analyzer.json
Normal file
@@ -0,0 +1,332 @@
{
  "name": "Homelab Log Analyzer",
  "nodes": [
    {
      "parameters": {
        "rule": {
          "interval": [
            {
              "field": "hours",
              "hoursInterval": 6
            }
          ]
        }
      },
      "id": "schedule-trigger",
      "name": "Every 6 Hours",
      "type": "n8n-nodes-base.scheduleTrigger",
      "typeVersion": 1.2,
      "position": [
        250,
        400
      ]
    },
    {
      "parameters": {
        "command": "docker service logs --tail 100 --timestamps traefik_traefik 2>&1 || echo 'Service not found'"
      },
      "id": "logs-traefik",
      "name": "Get Traefik Logs",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [
        500,
        200
      ],
      "continueOnFail": true
    },
    {
      "parameters": {
        "command": "docker service logs --tail 100 --timestamps n8n_n8n 2>&1 || echo 'Service not found'"
      },
      "id": "logs-n8n",
      "name": "Get n8n Logs",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [
        500,
        350
      ],
      "continueOnFail": true
    },
    {
      "parameters": {
        "command": "docker service logs --tail 100 --timestamps ai_openwebui 2>&1 || echo 'Service not found'"
      },
      "id": "logs-openwebui",
      "name": "Get OpenWebUI Logs",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [
        500,
        500
      ],
      "continueOnFail": true
    },
    {
      "parameters": {
        "command": "docker service logs --tail 100 --timestamps infrastructure_komodo-core 2>&1 || echo 'Service not found'"
      },
      "id": "logs-komodo",
      "name": "Get Komodo Logs",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [
        500,
        650
      ],
      "continueOnFail": true
    },
    {
      "parameters": {
        "command": "docker service logs --tail 100 --timestamps monitoring_prometheus 2>&1 || echo 'Service not found'"
      },
      "id": "logs-prometheus",
      "name": "Get Prometheus Logs",
      "type": "n8n-nodes-base.executeCommand",
      "typeVersion": 1,
      "position": [
        500,
        800
      ],
      "continueOnFail": true
    },
    {
      "parameters": {
        "jsCode": "const items = $input.all();\n\nconst logAnalysis = {\n  timestamp: new Date().toISOString(),\n  services: [],\n  errors: [],\n  warnings: [],\n  summary: {}\n};\n\nconst errorPatterns = [\n  /ERROR/gi,\n  /FATAL/gi,\n  /CRITICAL/gi,\n  /FAIL/gi,\n  /panic:/gi,\n  /exception/gi\n];\n\nconst warningPatterns = [\n  /WARN/gi,\n  /WARNING/gi,\n  /deprecated/gi,\n  /timeout/gi,\n  /retry/gi\n];\n\nfor (const item of items) {\n  const nodeName = item.json.node || 'unknown';\n  const stdout = item.json.stdout || '';\n  const lines = stdout.split('\\n').filter(l => l.trim());\n  \n  const serviceLog = {\n    name: nodeName,\n    totalLines: lines.length,\n    errors: [],\n    warnings: [],\n    recentEntries: lines.slice(-10) // Last 10 lines\n  };\n  \n  // Scan for errors and warnings\n  lines.forEach(line => {\n    const matchesError = errorPatterns.some(pattern => pattern.test(line));\n    const matchesWarning = warningPatterns.some(pattern => pattern.test(line));\n    \n    if (matchesError) {\n      const errorEntry = {\n        service: nodeName,\n        line: line,\n        timestamp: line.match(/^\\d{4}-\\d{2}-\\d{2}T[\\d:]+\\.\\d+Z/) ? line.split(' ')[0] : null\n      };\n      serviceLog.errors.push(errorEntry);\n      logAnalysis.errors.push(errorEntry);\n    } else if (matchesWarning) {\n      const warningEntry = {\n        service: nodeName,\n        line: line,\n        timestamp: line.match(/^\\d{4}-\\d{2}-\\d{2}T[\\d:]+\\.\\d+Z/) ? line.split(' ')[0] : null\n      };\n      serviceLog.warnings.push(warningEntry);\n      logAnalysis.warnings.push(warningEntry);\n    }\n  });\n  \n  logAnalysis.services.push(serviceLog);\n}\n\n// Generate summary\nlogAnalysis.summary = {\n  totalServices: logAnalysis.services.length,\n  totalErrors: logAnalysis.errors.length,\n  totalWarnings: logAnalysis.warnings.length,\n  servicesWithErrors: logAnalysis.services.filter(s => s.errors.length > 0).map(s => s.name),\n  servicesWithWarnings: logAnalysis.services.filter(s => s.warnings.length > 0).map(s => s.name)\n};\n\nreturn [{ json: logAnalysis }];"
      },
      "id": "parse-logs",
      "name": "Parse and Analyze Logs",
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [
        750,
        500
      ]
    },
    {
      "parameters": {
        "method": "POST",
        "url": "=http://lm-studio:1234/v1/chat/completions",
        "sendBody": true,
        "bodyParameters": {
          "parameters": [
            {
              "name": "model",
              "value": "=qwen2.5-coder-7b-instruct"
            },
            {
              "name": "messages",
              "value": "={{ [{\"role\":\"system\",\"content\":\"You are a Docker/Kubernetes expert and log analyzer. Analyze these Docker service logs and identify: 1) Critical issues requiring immediate attention 2) Performance concerns 3) Configuration problems 4) Recommended actions. Respond in JSON format with: critical_issues (array), performance_concerns (array), config_issues (array), recommendations (array).\"}, {\"role\":\"user\",\"content\":\"Analyze these homelab service logs:\\n\\nSummary: \" + JSON.stringify($json.summary, null, 2) + \"\\n\\nErrors Found: \" + JSON.stringify($json.errors.slice(0, 20), null, 2) + \"\\n\\nWarnings Found: \" + JSON.stringify($json.warnings.slice(0, 20), null, 2)}] }}"
            },
            {
              "name": "temperature",
              "value": "=0.2"
            },
            {
              "name": "max_tokens",
              "value": "=1500"
            }
          ]
        },
        "options": {
          "timeout": 30000
        }
      },
      "id": "ai-log-analysis",
      "name": "AI Log Analysis",
      "type": "n8n-nodes-base.httpRequest",
      "typeVersion": 4.2,
      "position": [
        1000,
        500
      ]
    },
    {
      "parameters": {
        "jsCode": "const logData = $('Parse and Analyze Logs').item.json;\nconst aiResponse = $json.choices[0].message.content;\n\nlet aiAnalysis;\ntry {\n  // Extract JSON from response (AI might wrap it in markdown)\n  const jsonMatch = aiResponse.match(/\\{[\\s\\S]*\\}/);\n  aiAnalysis = jsonMatch ? JSON.parse(jsonMatch[0]) : { raw: aiResponse };\n} catch (e) {\n  aiAnalysis = { raw: aiResponse };\n}\n\nconst report = {\n  generated_at: new Date().toISOString(),\n  period: '6 hours',\n  summary: logData.summary,\n  top_errors: logData.errors.slice(0, 10),\n  top_warnings: logData.warnings.slice(0, 10),\n  ai_analysis: aiAnalysis,\n  action_required: logData.summary.totalErrors > 10 || (aiAnalysis.critical_issues && aiAnalysis.critical_issues.length > 0)\n};\n\nreturn [{ json: report }];"
      },
      "id": "build-log-report",
      "name": "Build Log Report",
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [
        1250,
        500
      ]
    },
    {
      "parameters": {
        "conditions": {
          "options": {
            "leftValue": "",
            "caseSensitive": true,
            "typeValidation": "strict"
          },
          "combinator": "or",
          "conditions": [
            {
              "id": "has-action-required",
              "leftValue": "={{ $json.action_required }}",
              "rightValue": true,
              "operator": {
                "type": "boolean",
                "operation": "true"
              }
            },
            {
              "id": "many-errors",
              "leftValue": "={{ $json.summary.totalErrors }}",
              "rightValue": 5,
              "operator": {
                "type": "number",
                "operation": "gt"
              }
            }
          ]
        }
      },
      "id": "should-alert-logs",
      "name": "Should Alert?",
      "type": "n8n-nodes-base.if",
      "typeVersion": 2,
      "position": [
        1500,
        500
      ]
    }
  ],
  "pinData": {},
  "connections": {
    "Every 6 Hours": {
      "main": [
        [
          {
            "node": "Get Traefik Logs",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get n8n Logs",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get OpenWebUI Logs",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get Komodo Logs",
            "type": "main",
            "index": 0
          },
          {
            "node": "Get Prometheus Logs",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get Traefik Logs": {
      "main": [
        [
          {
            "node": "Parse and Analyze Logs",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get n8n Logs": {
      "main": [
        [
          {
            "node": "Parse and Analyze Logs",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get OpenWebUI Logs": {
      "main": [
        [
          {
            "node": "Parse and Analyze Logs",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get Komodo Logs": {
      "main": [
        [
          {
            "node": "Parse and Analyze Logs",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Get Prometheus Logs": {
      "main": [
        [
          {
            "node": "Parse and Analyze Logs",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Parse and Analyze Logs": {
      "main": [
        [
          {
            "node": "AI Log Analysis",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "AI Log Analysis": {
      "main": [
        [
          {
            "node": "Build Log Report",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Build Log Report": {
      "main": [
        [
          {
            "node": "Should Alert?",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "1",
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "homelab"
  },
  "id": "homelab-log-analyzer",
  "tags": []
}