60 req/hour for unauthenticated users
That's low enough that it may cause problems for a lot of infrastructure. Like, I'm pretty sure that the MELPA emacs package repository builds out of git, and a lot of that is on github.
That’s low enough that it may cause problems for a lot of infrastructure.
Likely the point. If you need more, get an API key.
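For anyone curious what that looks like in practice, GitHub's REST API exposes your current quota through its rate_limit endpoint (which doesn't itself count against the limit), so you can compare the anonymous allowance with what a token gets you. A minimal sketch; the GITHUB_TOKEN environment variable name is just an example:

```python
import json
import os
import urllib.request

# GitHub's rate_limit endpoint reports your current quota without costing a request.
URL = "https://api.github.com/rate_limit"

def core_quota(token=None):
    req = urllib.request.Request(URL, headers={"User-Agent": "rate-limit-check"})
    if token:
        # Any personal access token lifts you well above the anonymous 60/hour.
        req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["resources"]["core"]

print(core_quota())                                # anonymous: shared per source IP
print(core_quota(os.environ.get("GITHUB_TOKEN")))  # authenticated: per token/user
```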
I didn't think of that - also for nvim you typically pull plugins from git repositories
Do you think any infrastructure is pulling that often while unauthenticated? It seems like an easy fix either way (in my admittedly non devops opinion)
It's gonna be problematic in particular for organisations with larger offices. If you've got hundreds of devs/sysadmins under the same public IP address, those 60 requests/hour are shared between them.
Basically, I expect unauthenticated pulls to no longer be possible at my day job, which means repos hosted on GitHub become a pain.
Same problem for CGNAT users
If I’m using Ansible or something to pull images it might get that high.
Of course the fix is to pull it once and copy the files over, but I could see this breaking prod for folks who didn’t write it that way in the first place
Crazy how many people think this is okay, yet left Reddit because of their API shenanigans. GitHub is already halfway to requiring signing in to view anything, like Twitter (X).
That's just how the cookie crumbles.
No no, no no no no, no no no no, no no there's no limit
Until there is.
I think people are grossly underestimating the sheer size and significance of the issue at hand. Forgejo will very likely eventually get to the same point GitHub is at right now, and will have to employ some of the same safeguards.
Except Forgejo is open source and you can run your own instance of it. I do, and it's great.
That's a very accurate statement which has absolutely nothing to do with what I said. The fact of the matter is that those who seek out a GitHub alternative generally do so because they dislike Microsoft or closed-source platforms. Which is great, but those platforms' hosted instances see an overwhelmingly significant portion of users who visit because they choose not to self-host. It's a lifecycle.
By step 30 you're selling everyone's data and pushing resource restrictions because it's expensive to run a popular service that's generally free. That doesn't change simply because people can self-host if they want.
Dude, this is cool!
It works really well too. I have an instance.
No, no limits, we'll reach for the skyyyy
LOL!!!! RIP GitHub
EDIT: trying to compile any project from source that uses git submodules will be interesting, e.g. ROCm has more than 60 submodules to pull in 💀
The Go module system pulls dependencies from their sources. This should be interesting.
Even if you host your project on a different provider, many libraries are on GitHub. All those unauthenticated Arch users trying to install Go-based software that pulls dependencies from GitHub.
How does the Rust module system work? How does pip?
For Rust, as I understand, crates.io hosts a copy of the source code. It is possible to specify a Git repository directly as a dependency, but apparently, you cannot do that if you publish to crates.io.
So, it will cause pain for some devs, but the ecosystem at large shouldn't implode.
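For reference, a git dependency in Cargo looks something like the snippet below (crate name and URL are made up); it's this kind of entry that would hit the unauthenticated limit, and it's the thing the comment above says you apparently can't keep when publishing to crates.io:

```toml
[dependencies]
# Hypothetical crate pulled straight from a GitHub repo instead of crates.io
some-crate = { git = "https://github.com/example/some-crate", branch = "main" }
```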
Already not looking forward to the next updates on a few systems.
Yeah, this could very well kill some package managers without some real heavy lifting.
Scoop relies on git repos to work (scoop.sh, the Windows package manager).
The enshittification begins (continues?)...
just now? :)
Good thing git is “federated” by default.
And then you have Fossil, which is GitHub in a box.
Just browsing GitHub I've hit this limit.
I've hit it many times so far, even as quickly as the second page view (first internal link clicked) after more than a day or two since the last visit (yes, even with cleared browser data or in a private window).
It's fucking stupid how quickly they throw up a roadblock.
THIS is why I clone all my commonly used repos to my personal Gitea instance.
That's actually kind of an interesting idea.
Is there a reasonable way that I could host my own UI that will keep the various repos I care about cloned and always up to date automatically?
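As far as I know, Gitea and Forgejo can do this natively: create the repo as a migration and mark it as a mirror, and it re-syncs on a schedule. If you'd rather roll it yourself, a bare `git clone --mirror` per repo plus a small script on a timer covers it. A rough sketch, with the mirror directory path made up:

```python
import subprocess
from pathlib import Path

# Hypothetical directory holding bare mirrors made with `git clone --mirror <url>`.
MIRROR_ROOT = Path("/srv/git-mirrors")

def sync_all():
    for repo in sorted(MIRROR_ROOT.glob("*.git")):
        # Fetch everything and prune refs deleted upstream so the mirror stays exact.
        subprocess.run(["git", "remote", "update", "--prune"], cwd=repo, check=True)

if __name__ == "__main__":
    sync_all()  # run from cron or a systemd timer to keep the mirrors fresh
```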
I recently switched my instance from Gitea to Forgejo because everyone said to do it, and it was easy to do.
Is "authenticated" like when you use a private key with git clone? Stupid question, I know.
Also, this might be terrible if you subscribe to filter lists hosted on raw GitHub in uBlock or AdGuard.
This is specific to the GH REST API I think, not operations like doing a git clone to copy a repo to local machine, etc.
These changes will apply to operations like cloning repositories over HTTPS, anonymously interacting with our REST APIs, and downloading files from raw.githubusercontent.com.
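To the question a couple of comments up: yes, as I understand it a clone over SSH with your usual key counts as authenticated, and over HTTPS a personal access token can stand in for a password. A rough sketch of both (repo name and environment variable are made up; in practice a credential helper is nicer than embedding the token in the URL):

```python
import os
import subprocess

REPO = "example/project"  # hypothetical repository

# Over SSH, the key you already push with authenticates the clone.
subprocess.run(["git", "clone", f"git@github.com:{REPO}.git", "project-ssh"], check=True)

# Over HTTPS, a personal access token stands in for the password.
token = os.environ["GITHUB_TOKEN"]  # example variable name
subprocess.run(
    ["git", "clone", f"https://{token}@github.com/{REPO}.git", "project-https"],
    check=True,
)
```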