GitHub is owned by Microsoft, so don't worry, it's going to get worse
Is this going to fuck over Obtainium?
I honestly don't really see the problem here. This seems to mostly be targeting scrapers.
For unauthenticated users you are limited to public data only and 60 requests per hour, or 30k if you're using Git LFS. And for authenticated users it's 60k/hr.
What could you possibly be doing besides scraping that would hit those limits?
You might be behind a shared IP with NAT or CG-NAT that shares that limit with others, or you might be fetching files from raw.githubusercontent.com as part of an update system that doesn't have access to browser credentials, or Git cloning over https:// to avoid having to unlock your SSH key every time, or cloning a repo whose submodules each issue their own requests. An hour is a long time. Imagine you let uBlock Origin update its filter lists, then you git clone something with a few submodules, and so does your coworker, and now you're blocked for an entire hour.
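If you want to see how close you are to the cap, GitHub's `/rate_limit` endpoint reports your quota without counting against it. A minimal sketch in Python, assuming the `requests` library is installed; the `GITHUB_TOKEN` environment variable is optional and only illustrative:

```python
import os
import requests  # pip install requests

# GET /rate_limit reports your current quota and does not
# itself count against the rate limit.
headers = {}
token = os.environ.get("GITHUB_TOKEN")  # optional; authenticated requests get a much higher limit
if token:
    headers["Authorization"] = f"Bearer {token}"

resp = requests.get("https://api.github.com/rate_limit", headers=headers)
core = resp.json()["resources"]["core"]
print(f"limit={core['limit']} remaining={core['remaining']} resets_at_epoch={core['reset']}")
```

Run it unauthenticated and you'll see the small shared-per-IP budget; everyone behind the same NAT is drawing from that same pool.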
I've hit those limits many times when signed out, just scrolling through code. The front end must be sending off tonnes of background requests.
This doesn't include any requests from the website itself
I see the "just create an account" and "just login" crowd have joined the discussion. Some people will defend a monopolist no matter what. If GitHub introduced ID checks à la Google or required a Microsoft account to log in, they'd just shrug and go "create a Microsoft account then, stop bitching". They don't realise they are being boiled and don't care. Consoomer behaviour.
Open source repositories should rely on p2p. Torrenting repos is the way, I think.
Not only for this. At any point m$ could take down your repo if they or their investors don't like it.
I wonder if something like that already exists, and if it could work with git?
Git has been p2p and distributed from day one. GitHub is just a convenient website. If Microsoft takes down your repo, just upload it to another system. Nothing but convenience will be lost.
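That portability is a couple of commands. A minimal sketch, assuming a hypothetical "backup" remote on Codeberg (any git host, or a bare repo on your own server, works the same way):

```python
import subprocess

# Add a second remote; the URL here is hypothetical.
subprocess.run(["git", "remote", "add", "backup",
                "https://codeberg.org/example/project.git"], check=True)

# --mirror pushes every ref (branches, tags, notes), so the backup
# holds a complete copy of the repository's history.
subprocess.run(["git", "push", "--mirror", "backup"], check=True)
```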
Not entirely true. You lose tickets and PRs in that scenario.
I've seen git-bug mentioned a few times for decentralised issue tracking (it stores issues as objects in the git repo itself, so they travel with clones). Never tried it, but the idea is interesting.
The project's official repo should probably exist in a single location so that there is an authoritative version. At that point p2p is only necessary if traffic for the source code is getting too expensive for the project.
Personally I think the SourceHut model is closest to the ideal setup for OSS projects. Though I use Codeberg for my personal stuff because I'm cheap and lazy.
I'm wary of external dependencies. They are cool now, but will they be cool in the future? Will they even exist?
One thing p2p excels at is resilience. People are still using eDonkey even though it's abandoned.
A repo signature should deal with "fake copies". It's true we have the problem that the BitTorrent protocol wasn't designed for updating files, so a different protocol would be needed; I don't even know how possible/practical that is. Any big project should probably host its own remote repo and mirror it to other platforms as needed. GitHub-only repos were always a dangerous practice.
> It's true we have the problem that the BitTorrent protocol wasn't designed for updating files
BitTorrent has updatable torrents these days (BEP 46, mutable torrents via the DHT)
If you're able to easily migrate issues etc. to a new instance, then you don't need to worry about a particular service provider getting shitty. At which point your main concern is temporary outages.
Perhaps this is more of a concern for some projects (e.g. anything that angers Nintendo's lawyers). But for most, I imagine that the added complexity of distributed p2p hosting would outweigh the upsides.
Not saying it's a bad idea, in fact I like it a lot, but I can see why it's not a high priority for most OSS devs
It's always blocked me from searching in Firefox when I'm logged out, for some reason.
Probably getting hammered by AI scrapers
you mean, doin' what microsoft and their ai 'partners' do to others?
Yeah but they're allowed to do it because they have brazillions of dollars.
They literally own GitHub. Brazillions well spent.
The funny thing is that rate limits won't help them with genai scrapers
If Microsoft knows how to do one thing well, it’s killing a successful product.
Wow so surprising, never saw this coming, this is my surprised face. :-l