# Automation for the sake of automation?
I'm the type of person who sees a manual step and immediately starts thinking about how it can be automated. You can always do a little more.
My homelab runs a number of Docker containers that serve different functions for my network - DNS, reverse proxy, ad blocking, monitoring, etc. I've been using Diun for a long time to monitor images and get a ping on a private Discord server when updates are available. I chose this approach after running Portainer for a while, which in turn ran Watchtower to update images automatically. Portainer had high overhead and features I didn't need.
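For reference, a Diun setup can be as small as a single container watching the local Docker socket. The sketch below is illustrative rather than my exact configuration: the schedule, the Discord webhook URL, and the paths are placeholders, and the `DIUN_*` environment variable names should be checked against Diun's current docs.

```sh
# Rough sketch: run Diun against the local Docker daemon and notify Discord.
# Schedule, webhook URL, and env-var names are assumptions - verify before use.
docker run -d --name diun \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  -e DIUN_WATCH_SCHEDULE="0 */6 * * *" \
  -e DIUN_PROVIDERS_DOCKER=true \
  -e DIUN_PROVIDERS_DOCKER_WATCHBYDEFAULT=true \
  -e DIUN_NOTIF_DISCORD_WEBHOOKURL="https://discord.com/api/webhooks/<id>/<token>" \
  crazymax/diun:latest
```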
Once I had gone through what needed updating, I simply ran `docker compose pull && docker compose up -d` over SSH.
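Concretely, the whole "manual" part boiled down to something like this per stack (the host alias and path are examples, not my actual layout):

```sh
# One stack's update, end to end - host and path are placeholders
ssh homelab 'cd /opt/stacks/pihole && docker compose pull && docker compose up -d'
```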
There were still several manual steps involved, and I started wondering whether the whole flow could be automated. I already had a private GitHub repo documenting my homelab, so my mind immediately went to pull requests that I could easily approve, with the update then being pushed to the server.
## Automation
After the summer of 2025, I had done substantial work with Claude Code to document my network, both in prose and according to Infrastructure as Code principles. That gave me a good foundation for looking at more automated CI/CD flows.
I started by looking at replacements for Diun and found Renovate, which is available as a free GitHub App designed to monitor a repository's dependencies and open issues or pull requests when updates are available.
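Getting Renovate to watch Docker Compose files can be as simple as one config file in the repo. The snippet below is a hedged example, not my exact setup; `config:recommended` and the `docker-compose` manager are documented Renovate options, but check the docs before relying on the details.

```sh
# Minimal renovate.json committed to the homelab repo (illustrative only)
cat > renovate.json <<'EOF'
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:recommended"],
  "enabledManagers": ["docker-compose"]
}
EOF
```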
The flow I was looking at:

- Renovate scans my dependencies and creates PRs when updates are available
- I review the PR and merge it if I want to apply the update
- The update runs - here there were two options:
  - GitHub Actions with a self-hosted runner running Ansible
  - A GitHub webhook that calls a service on my homelab, which runs the update script
I first tried a self-hosted runner, but it required SSH access between machines and yet another service to maintain. The webhook variant felt simpler.
I found adnanh/webhook, which meant I didn't need to write my own webhook receiver for running shell commands. I combined it with a Cloudflare Tunnel to expose the service to GitHub without opening any ports.
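The receiving end looked roughly like the sketch below. Treat it as an outline under assumptions: the paths, hook id, and secret are placeholders, the hook definition follows adnanh/webhook's documented format for verifying GitHub's HMAC signature, and the cloudflared command shown is the quick-tunnel form - in practice you'd use a named tunnel with a stable hostname and point GitHub's webhook at it.

```sh
# 1. The script the hook runs - the same commands as before (paths are examples)
cat > /opt/hooks/redeploy.sh <<'EOF'
#!/bin/sh
set -eu
cd /opt/stacks/pihole
docker compose pull
docker compose up -d
EOF
chmod +x /opt/hooks/redeploy.sh

# 2. Hook definition for adnanh/webhook; the trigger rule checks GitHub's
#    X-Hub-Signature-256 header against a shared secret
cat > /opt/hooks/hooks.json <<'EOF'
[
  {
    "id": "redeploy",
    "execute-command": "/opt/hooks/redeploy.sh",
    "trigger-rule": {
      "match": {
        "type": "payload-hmac-sha256",
        "secret": "<github-webhook-secret>",
        "parameter": { "source": "header", "name": "X-Hub-Signature-256" }
      }
    }
  }
]
EOF

# 3. Run the webhook server and expose it through Cloudflare
#    (GitHub's webhook URL then ends in /hooks/redeploy)
webhook -hooks /opt/hooks/hooks.json -port 9000 &
cloudflared tunnel --url http://localhost:9000
```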
It worked pretty well, but after a few days I realized I hadn't saved any steps - I was still reviewing each update manually. Instead of running `docker compose pull`, I now had a complex flow with multiple services that all needed maintenance.
In the end, I went back to Diun and manual updates. Sometimes the journey is more important than the destination. The tools I tested are good and serve their purposes! But for me, the simpler flow is the right choice for now.
## References
| Tool | Description |
|---|---|
| Diun | Monitors Docker images and notifies on updates |
| Portainer | GUI for Docker management |
| Watchtower | Automatic Docker container updates |
| Renovate | Scans dependencies and creates PRs automatically |
| adnanh/webhook | Simple webhook server that runs shell commands |
| Cloudflare Tunnel | Exposes local services without opening ports |
| Ansible | Infrastructure as Code and configuration management |
| GitHub Actions | CI/CD with support for self-hosted runners |
| Claude Code | AI assistant for coding and documentation |