This is how I do it. It works internally and externally, though it's more than OP needs. :)
To add to what's been said (in case it's useful to others), it's worth looking at SWAG and Authelia to do the proxying for services visible to the Internet. I run them as Docker containers, and between them they do all the proxying, take care of the SSL certificate and auto-renew it, and add MFA to the services you run that support it (all browsing, MFA-aware apps, etc).
Another thing I like about SWAG's setup is that you select which services/hostnames you want to expose by naming them in the SUBDOMAINS environment variable in Docker (easy to remove one if you take a service down for maintenance, etc.), and then each gets its own config file in Nginx's proxy-confs directory that does the https://name.domain -> http://IP:port redirection for that service (e.g. wordpress.subdomain.conf), assuming the traffic has met whatever MFA and geo-whitelisting requirements you have set up.
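To make that concrete, here's a hedged sketch of how those proxy-confs get enabled, assuming the linuxserver.io SWAG image's layout (the container name `swag` and the volume path are examples from my setup, adjust for yours). The image ships templates as `*.subdomain.conf.sample` files; dropping the `.sample` suffix activates one:

```shell
# Enable the preset WordPress proxy-conf by copying the shipped sample
# (run on the host, inside SWAG's mapped /config volume):
cp swag/nginx/proxy-confs/wordpress.subdomain.conf.sample \
   swag/nginx/proxy-confs/wordpress.subdomain.conf

# Reload nginx inside the container so it picks up the new conf:
docker exec swag nginx -s reload
```

You'd usually also edit the copied conf to point at the right container name/IP and port before reloading.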
I also have Cloudflare protecting the traffic (proxying the domain's A record and the wildcard CNAME) to my public address, which adds another layer.
I have zero problem with curated or algorithmic timelines. I have a 100% problem when there isn't a chronological timeline option.
It's simple really: give me the permanent option of chronological without the dark pattern fuckery of having to reset it periodically, or fuck off forever.
Yeah, it makes for a nice workflow, doesn't it? It doesn't give you the "fully automated" achievement, but it's not much of a chore. :)
Have you considered something like borgbackup? It does good deduplication, so you won't have umpteen copies of unchanged files.
I use it mostly for my daily driver laptop to back up to my NAS, and the Gitlab CE container running on the NAS acts as the equivalent for its local Git repos, which are then straightforward to copy elsewhere. I haven't got it scripting anything like bouncing containers or DB dumps, though.
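For anyone curious what that looks like, here's a hedged sketch of a basic borg workflow (the repo path, archive name, and retention numbers are just examples, not what I actually run):

```shell
# One-time: create an encrypted repo on the NAS mount
borg init --encryption=repokey /mnt/nas/borg-repo

# Daily-ish: create an archive; dedup means unchanged files
# add almost nothing to the repo each time
borg create --stats --compression zstd \
    /mnt/nas/borg-repo::laptop-{now:%Y-%m-%d} ~/Documents ~/Projects

# Thin out old archives on a retention schedule
borg prune --keep-daily 7 --keep-weekly 4 --keep-monthly 6 /mnt/nas/borg-repo
```

The `{now:...}` placeholder in the archive name is expanded by borg itself, so each run gets a dated archive without any scripting.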
Agreed. The lack of varied examples in documentation is my most common tripping point. When I hate myself, I visit ~~Sarcasm~~StackOverflow to find examples, and then reference those against the module's documentation.
And it's definitely become an easier process as I've read more documentation.
Do you have a NAS? It can be a good way to get decent functionality without extra hardware, especially if you're doing proof of concept or temporary stuff.
My self-hosting Docker setup is split between 12 permanent stacks on a Synology DS920+ NAS (with upgraded RAM) and 4 on a Raspberry Pi 4B, using Portainer and its agent on the Pi to manage them. The NAS is also using Synology's Drive (like Dropbox or GDrive) and Photos (like Google Photos).
I've had the NAS running servers for Valheim and VRising in the past, but they require that fewer containers be running alongside them, as game servers on Linux are usually unoptimised and/or emulating Windows.
If I decide to host a game server again, I'll probably look at a NUC. I've done the DIY mini-ITX route in the past (for an XBMC-based media centre with HDMI output) and it was great, so that's another option.
This is what I do. I find keeping 20-odd docker-compose files (almost always static content) backed up to be straightforward.
Each is configured to bring up/down the whole stack in the right order, so any Watchtower-triggered update is seamless. My Gotify container sends me an update every time one changes. I use Portainer to manage them across two devices, but that's just about convenience.
I disable Watchtower for twitchy containers, and handle them manually. For the rest, the only issue I've seen is if there's a major change in how the container/stack is built (a change in database, etc), but that's happened twice and I've been able to recover.
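Disabling Watchtower per-container is done with a label it documents. Here's a hedged compose sketch (the service names and image are placeholders, not my actual stack):

```yaml
# Opt one "twitchy" container out of Watchtower's automatic updates
services:
  someapp:
    image: example/someapp:latest
    labels:
      - com.centurylinklabs.watchtower.enable=false   # update this one manually
  watchtower:
    image: containrrr/watchtower
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```

Everything without that label still gets updated automatically.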
I used Linuxserver's Docker container of Dokuwiki when I migrated my notes from Evernote a few years ago. It was easy to set up and configure, has a number of plugins that further improve it, and it did the job really well.
I ended up migrating it all to Obsidian this year, as it serves my needs better, but otherwise I'd still be using Dokuwiki.
DNS-O-Matic (recommended by CloudFlare, among others) combined with SWAG and Authelia will handle dynamic DNS, reverse proxying, SSL certificates, and MFA. SWAG (nginx, Let's Encrypt and Certbot) and Authelia (MFA) run nicely in a two-container Docker stack.
Mine have been running for ~18 months on my NAS, though I have a fixed IP so no longer use a DDNS provider.
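A minimal sketch of that two-container stack, assuming the linuxserver.io SWAG image and the official Authelia image (domain, subdomains, paths, and timezone below are placeholders, and Authelia's own configuration.yml is a separate exercise):

```yaml
services:
  swag:
    image: lscr.io/linuxserver/swag
    cap_add:
      - NET_ADMIN
    environment:
      - URL=example.com
      - SUBDOMAINS=wordpress,gitlab   # the services you choose to expose
      - VALIDATION=http               # or dns, depending on your setup
      - TZ=Etc/UTC
    volumes:
      - ./swag:/config
    ports:
      - 443:443
      - 80:80
    restart: unless-stopped
  authelia:
    image: authelia/authelia
    volumes:
      - ./authelia:/config
    restart: unless-stopped
```

SWAG's proxy-confs then point authenticated routes at Authelia before passing traffic to the backend service.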
"Webring".... good lord, that takes me back. :)
This is pretty much what I'm doing.
If sub.rehab shows 5 threadiverse alternatives for a subreddit, I'll have a look at each and often sub to 2-3 with the highest subscriber count/posts I like. I'll then monitor them over time to see which, if any, to unsub from.