Selfhosted
A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.
Rules:
- Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.
- No spam posting.
- Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around self-hosting, please include details to make it clear.
- Don't duplicate the full text of your blog or GitHub page here. Just post the link for folks to click.
- Submission headline should match the article title (don't cherry-pick information from the title to fit your agenda).
- No trolling.
Resources:
- selfh.st Newsletter and index of selfhosted software and apps
- awesome-selfhosted software
- awesome-sysadmin resources
- Self-Hosted Podcast from Jupiter Broadcasting
Any issues with the community? Report them using the report flag.
Questions? DM the mods!
S3 storage is simpler than local files? I think you need to elaborate.
S3 storage is simpler than running scp -r to a remote node, because you can copy files to S3 in a massively parallel way, while scp is generally sequential. It's also very easy to protect the API, since it's just HTTP (and, incidentally, it's significantly faster than WebDAV).
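To illustrate the parallel-copy point, here's a rough sketch with rclone against an S3-compatible endpoint (the remote name `garage` and the bucket `blog` are placeholders, not anything from this thread):

```
# assumes an rclone remote named "garage" is already configured to point at the
# Garage S3 endpoint (type s3, provider Other, endpoint URL + access/secret keys)

# upload the built site with 16 parallel transfers instead of scp's one-file-at-a-time
rclone copy ./public garage:blog --transfers 16 --progress
```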
Nobody should be using scp; use rsync.
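For comparison, a typical rsync push of a built site to a remote web root looks something like this (host and paths are placeholders):

```
# -a preserves permissions and timestamps, -z compresses in transit,
# --delete removes remote files that no longer exist locally
rsync -az --delete ./public/ user@example.com:/var/www/blog/
```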
S3 goes beyond the scope you describe. You disqualify yourself with such statements.
Clearly I mean Garage here when I write "S3". It is significantly easier and faster to run
hugo deploy
and let it talk to Garage than to figure out where on a remote node the nginx k8s pod has its data PV mounted and scp files into it. Yes, I could automate that. Yes, I could pin the blog's pod to a single node. Yes, I could use a stable host path for that and use rsync, and I could skip the whole Kubernetes insanity for a static HTML blog. But I somewhat enjoy poking the tech, and yes, using Garage makes deploys faster and gives me a stable, well-known API endpoint for both data transfers and serving the content, with very little maintenance required to make it work.
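For anyone curious, a minimal sketch of that workflow; the target name "garage", the endpoint, the bucket, and the keys are all placeholders, and the exact deployment URL parameters may differ by Hugo version:

```
# hugo.toml needs a matching deployment target, roughly:
#   [[deployment.targets]]
#   name = "garage"
#   URL  = "s3://blog?endpoint=s3.example.com&s3ForcePathStyle=true&region=garage"

export AWS_ACCESS_KEY_ID=GK...       # Garage access key (placeholder)
export AWS_SECRET_ACCESS_KEY=...     # Garage secret key (placeholder)

hugo --minify                 # build the static site into ./public
hugo deploy --target garage   # upload changed files to the S3-compatible bucket
```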
I don't follow. S3 is an AWS service that these tools emulate locally by providing the same API. But I'm happy to accept that there's just some misunderstanding 😃
In the context of my comments here, any mention of "S3" means "S3-compatible" in the way that's implemented by Garage. I hope that clarifies it for you.
Thank you
It is simpler when you’re doing stuff on the web and/or need to scale.
I disagree. Local file access is always superior. If you disagree, your target solution is likely poor to begin with.
This is such a poor attempt at trolling. Don’t you have better things to do?
Bro, I'm an AWS Cloud Solutions Architect and I seriously don't know what you're talking about. And no, when I waste time on Lemmy, there is literally nothing better to do.
AWS made S3. People built software to integrate S3 as a storage backend. Other people didn't want to do AWS, and built single-node imitations of the S3 service. Now you use those services and think that is S3, while it is only a crude replica of what S3 really is. At this point the S3 API is redundant and you could just as well store your assets close to your application. You have no real, global S3 delivery service anyway. What's the point?
Most people misuse AWS S3. Using stuff like minio is even more misguided.