submitted on 14 May 2025 (last edited shortly after posting) by chaospatterns@lemmy.world to c/programming@programming.dev

An update from GitHub: https://github.com/orgs/community/discussions/159123#discussioncomment-13148279

The rates are here: https://docs.github.com/en/rest/using-the-rest-api/rate-limits-for-the-rest-api?apiVersion=2022-11-28

  • 60 req/hour for unauthenticated users
  • 5000 req/hour for authenticated - personal
  • 15000 req/hour for authenticated - enterprise org
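A quick way to check which bucket a request falls into is the `/rate_limit` endpoint, which doesn't count against the core quota. A minimal sketch; `GITHUB_TOKEN` is just an assumed environment variable, and leaving it unset shows the anonymous 60/hour bucket:

```python
import json
import os
import urllib.request

# GET /rate_limit reports your current quota without consuming it.
req = urllib.request.Request("https://api.github.com/rate_limit")
token = os.environ.get("GITHUB_TOKEN")  # assumed env var; omit to see the unauthenticated limit
if token:
    req.add_header("Authorization", f"Bearer {token}")

with urllib.request.urlopen(req) as resp:
    core = json.load(resp)["resources"]["core"]

print(f"limit={core['limit']} remaining={core['remaining']} resets_at={core['reset']}")
```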
[-] midori_matcha@lemmy.world 28 points 6 hours ago

Github is owned by Microsoft, so don't worry, it's going to get worse

[-] DoucheBagMcSwag@lemmy.dbzer0.com 8 points 5 hours ago

Is this going to fuck over Obtainium?

[-] Lv_InSaNe_vL@lemmy.world 14 points 6 hours ago* (last edited 6 hours ago)

I honestly don't really see the problem here. This seems to mostly be targeting scrapers.

For unauthenticated users you are limited to public data only and 60 requests per hour, or 30k if you're using Git LFS. And for authenticated users it's 60k/hr.

What could you possibly be doing besides scraping that would hit those limits?

[-] chaospatterns@lemmy.world 10 points 4 hours ago* (last edited 4 hours ago)

You might be behind a shared IP with NAT or CG-NAT that shares that limit with others, or might be fetching files from raw.githubusercontent.com as part of an update system that doesn't have access to browser credentials, or Git cloning over https:// to avoid having to unlock your SSH key every time, or cloning a Git repo with submodules that separately issue requests. An hour is a long time. Imagine if you let uBlock Origin update filter lists, then you git clone something with a few submodules, and so does your coworker, and now you're blocked for an entire hour.
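For scripts that can't use credentials, about the best you can do is back off until the window resets. A minimal sketch, going by the documented X-RateLimit-*/Retry-After headers on api.github.com (raw.githubusercontent.com may behave differently), with octocat/Hello-World only as a placeholder repo:

```python
import time
import urllib.error
import urllib.request

def fetch_with_backoff(url: str) -> bytes:
    """GET a URL, sleeping until the rate-limit window resets on 403/429."""
    while True:
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if err.code not in (403, 429):
                raise
            # GitHub signals rate limiting with Retry-After (seconds)
            # or X-RateLimit-Reset (epoch seconds of the next window).
            retry_after = err.headers.get("Retry-After")
            reset = err.headers.get("X-RateLimit-Reset")
            if retry_after:
                wait = int(retry_after)
            elif reset:
                wait = max(0, int(reset) - int(time.time()))
            else:
                wait = 60  # fallback guess
            time.sleep(wait + 1)

# Placeholder request; unauthenticated, it counts against the shared 60/hour bucket.
data = fetch_with_backoff("https://api.github.com/repos/octocat/Hello-World")
```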

[-] Disregard3145@lemmy.world 2 points 3 hours ago

I hit those many times when signed out just scrolling through the code. The front end must be sending off tonnes of background requests

[-] Lv_InSaNe_vL@lemmy.world 4 points 3 hours ago

This doesn't include any requests from the website itself

[-] onlinepersona@programming.dev 30 points 12 hours ago

I see the "just create an account" and "just login" crowd have joined the discussion. Some people will defend a monopolist no matter what. If github introduced ID checks à la Google or required a Microsoft account to login, they'd just shrug and go "create a Microsoft account then, stop bitching". They don't realise they are being boiled and don't care. Consoomer behaviour.

Anti Commercial-AI license

[-] daniskarma@lemmy.dbzer0.com 9 points 11 hours ago

Open source repositories should rely on p2p. Torrenting repos is the way I think.

Not only for this. At any point m$ could take down your repo if they or their investors don't like it.

I wonder if something like that already exists, and if it could work with git?

[-] thenextguy@lemmy.world 8 points 8 hours ago

Git is p2p and distributed from day 1. Github is just a convenient website. If Microsoft takes down your repo, just upload to another system. Nothing but convenience will be lost.
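A rough sketch of what that looks like in practice, driving plain git from a script to keep a second remote in sync; the Codeberg URL is just a placeholder:

```python
import subprocess

# Placeholder mirror target; any other Git host works the same way.
MIRROR_URL = "https://codeberg.org/example/project.git"

def git(*args: str) -> None:
    subprocess.run(["git", *args], check=True)

# Register the backup remote once, then push every branch and tag to it.
try:
    git("remote", "add", "backup", MIRROR_URL)
except subprocess.CalledProcessError:
    pass  # the remote already exists
git("push", "--all", "backup")
git("push", "--tags", "backup")
```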

[-] witten@lemmy.world 5 points 6 hours ago

Not entirely true. You lose tickets and PRs in that scenario.
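You can at least pull a JSON dump of them ahead of time; the REST issues endpoint returns pull requests too. A rough unauthenticated sketch, with octocat/Hello-World as a stand-in repo:

```python
import json
import urllib.request

# Stand-in repo; /issues returns issues and pull requests (PRs carry a "pull_request" key).
OWNER, REPO = "octocat", "Hello-World"

items, page = [], 1
while True:
    url = (f"https://api.github.com/repos/{OWNER}/{REPO}/issues"
           f"?state=all&per_page=100&page={page}")
    with urllib.request.urlopen(url) as resp:
        batch = json.load(resp)
    if not batch:
        break
    items.extend(batch)
    page += 1

with open("issues-backup.json", "w") as f:
    json.dump(items, f, indent=2)
```

Comments and review threads need separate endpoints, and unauthenticated you'll burn through the 60/hour cap fast, which loops right back to the original complaint.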

[-] QuazarOmega@lemy.lol 1 points 3 hours ago

I've heard git-bug mentioned a few times for decentralised issue tracking; never tried it, but the idea is interesting

[-] samc@feddit.uk 7 points 11 hours ago

The project's official repo should probably exist in a single location so that there is an authoritative version. At that point p2p is only necessary if traffic for the source code is getting too expensive for the project.

Personally I think the SourceHut model is closest to the ideal setup for OSS projects. Though I use Codeberg for my personal stuff because I'm cheap and lazy

[-] daniskarma@lemmy.dbzer0.com 4 points 11 hours ago

I'm wary of external dependencies. They are cool now, but will they be cool in the future? Will they even exist?

One thing I think p2p excels at is resilience. People are still using eDonkey even though it's abandoned.

A repo signature should deal with "fake copies". It's true we have the problem that the BitTorrent protocol wasn't designed for updating files, so a different protocol would be needed; I don't even know how possible/practical that is. Any big project should probably host its own remote repo and copy it to other platforms as needed. GitHub-only repos were always a dangerous practice.

[-] Revan343@lemmy.ca 2 points 4 hours ago

It's true we have the problem that the BitTorrent protocol wasn't designed for updating files

Bittorrent v2 has updatable torrents

[-] samc@feddit.uk 2 points 9 hours ago

If you're able to easily migrate issues etc. to a new instance, then you don't need to worry about a particular service provider getting shitty. At which point your main concern is temporary outages.

Perhaps this is more of a concern for some projects (e.g. anything that angers Nintendo's lawyers). But for most, I imagine that the added complexity of distributed p2p hosting would outweigh the upsides.

Not saying it's a bad idea, in fact I like it a lot, but I can see why it's not a high priority for most OSS devs

[-] brachiosaurus@mander.xyz 7 points 11 hours ago

I have a question: why do the Lemmy devs keep using Microsoft's GitHub?

[-] irelephant@programming.dev 14 points 15 hours ago

It's always blocked me from searching in Firefox when I'm logged out, for some reason.

[-] traches@sh.itjust.works 128 points 22 hours ago

Probably getting hammered by ai scrapers

[-] rickyrigatoni@lemm.ee 6 points 9 hours ago

Yeah but they're allowed to do it because they have brazillions of dollars.

[-] Ugurcan@lemmy.world 1 points 3 hours ago

They literally own GitHub. Brazillions well spent.

[-] db0@lemmy.dbzer0.com 11 points 15 hours ago

The funny thing is that rate limits won't help them with genai scrapers

[-] hackeryarn@lemmy.world 81 points 21 hours ago

If Microsoft knows how to do one thing well, it’s killing a successful product.

[-] ozoned@piefed.social 19 points 17 hours ago

Wow so surprising, never saw this coming, this is my surprised face. :-l
