Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Be aware that your backup is useless if you don't back up Nextcloud's database when using a bucket as primary storage ☝🏻
I use Nextcloud with local storage, and I use rclone to back up to an S3 bucket (MinIO).
You can use rclone to back up your bucket directly; there's no need for restic. It's as simple as rsync.
An example would be:
rclone sync b2:mybucket otherprovider:otherbucket
You can use local storage too:
rclone sync b2:mybucket /my/path/here
Rclone can mount and back up almost everything. It's a Swiss Army knife and I love it.
Performance is really good with Nextcloud if it's configured well. Look at this thread
Understood. My hope was to mount the bucket locally (ro) and have it backed up with the container backups using the built-in Borg backup option.
I'd prefer to have proper incremental backups, not just a warm copy of the data.
It seems very capable, but I couldn't make it work for my purposes. I fought with rclone/AIO for a few hours yesterday.
I was, quite easily, able to mount the B2 bucket to a local path. I used the --allow-other option to make it available to the whole system. Everything was accessible via the CLI, but the Nextcloud AIO admin refused to allow me to add that path to the backup job. I was unable to find any logs that indicated why. If I could get this working, I think it would be ideal, as the backups would be consistent.
I also tried a couple of the serve options. The NFS option would launch, but mounts would fail with protocol errors. I couldn't get the docker plugin to sync up properly with Docker. I haven't tried the restic serve option yet. I can provide logs if requested.
Thanks for the help.
Did you try to mount your bucket on your host system via rclone?
rclone mount b2:bucket /path/on/host --daemon --vfs-cache-mode full
I would mount it on the host system and add an additional volume in your docker-compose.yml.
You could give this a try if you want to use it in your container 🤔
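Something like the fragment below is one way to wire the host mount into a container as a read-only volume. This is only a sketch — the service name and both paths are placeholders, and Nextcloud AIO ships its own compose setup, so the exact service to attach it to will differ:

```yaml
# docker-compose.yml fragment (sketch — service name and paths are placeholders)
services:
  nextcloud:
    volumes:
      # host path where rclone mounted the bucket : path inside the container, read-only
      - /path/on/host:/mnt/bucket:ro
```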
I did. That's where I ran into the problems.
I'm embarrassed to admit I didn't try this. I think I was too far into the weeds the other night. I'll give this a try.
If it works, I'm thinking I'll need to set up a systemd service to auto-mount the path on boot, and set it as a dependency of Docker so Docker doesn't start before it?
Your way should work with a systemd service. I do it the same way, but I only need one bucket mounted in my file system, so I don't need the dependencies.
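For the dependency question, a sketch of that setup could look like the two files below: a unit that runs the rclone mount in the foreground, plus a drop-in that makes docker.service wait for it. Unit names, remote, and paths are all assumptions for illustration:

```ini
# /etc/systemd/system/rclone-b2.service  (hypothetical unit name and paths)
[Unit]
Description=Mount B2 bucket via rclone
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
# foreground mount (no --daemon) so systemd can track the process
ExecStart=/usr/bin/rclone mount b2:bucket /path/on/host --allow-other --vfs-cache-mode full
ExecStop=/bin/fusermount -u /path/on/host
Restart=on-failure

[Install]
WantedBy=multi-user.target

# /etc/systemd/system/docker.service.d/wait-for-rclone.conf
# drop-in so Docker only starts once the mount is up
[Unit]
Requires=rclone-b2.service
After=rclone-b2.service
```

After creating both, `systemctl daemon-reload` and enable the mount unit so it comes up on boot.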
I was able to get the rclone mount on boot via a systemd unit without much trouble.
I even managed to drag it kicking and screaming into a docker volume that I mounted as an external volume to the Nextcloud AIO stack. It still refused to allow me to add it as a backup directory.
I think I'm throwing in the towel on getting Nextcloud to back it up via the built-in mechanism. I'll just schedule a separate job (cron/systemd) that runs shortly after the Nextcloud backup. It should be close enough for my purposes.
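The separate-job route could be sketched as a systemd service/timer pair like this — unit names, paths, and the schedule are placeholders, and the timer time would need to land after whenever the Nextcloud AIO backup actually runs:

```ini
# /etc/systemd/system/bucket-backup.service  (hypothetical names/paths)
[Unit]
Description=Sync B2 bucket to local backup storage

[Service]
Type=oneshot
ExecStart=/usr/bin/rclone sync b2:mybucket /backups/bucket

# /etc/systemd/system/bucket-backup.timer
[Unit]
Description=Run bucket backup shortly after the Nextcloud backup window

[Timer]
OnCalendar=*-*-* 04:30:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable with `systemctl enable --now bucket-backup.timer`; a cron line invoking the same rclone command would work just as well.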