This makes me genuinely curious, who thought that would be a good idea?
It feels like a lot of "contribution" to open source is suddenly fueled by AI hype. Is it some LinkedIn/TikTok "trick" being amplified, the idea that you'll somehow land a very well paid job at a BigTech company if you rack up a lot of contributions on popular projects?
Where does this trend actually come from?
Did anybody doing so ever bother checking the contribution guidelines to see which tasks should actually be prioritized and, if so, with which tools?
This seems like a recurring pattern so it's not a random idea someone had.
A "normal" download is sending a file from 1 computer (e.g. server) to 1 other computer (e.g. client).
An example of that would be an HTTP server like the one we are both using now: you (client) visit lemmy.ml (server) and it sends you back the index.html page your browser requested. That is a great solution when you have a page that must be updated dynamically, with the new information broadcast back to plenty of clients.

BitTorrent is a protocol like HTTP, but instead of having 1 computer send to many other computers, ALL computers send the parts they have, and ALL computers request the parts they are still missing. That's a different solution for a different problem, namely when a file is large enough and stable enough (does not change) that all the overhead is worthwhile.
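To make the "everyone sends what they have, everyone requests what they're missing" idea concrete, here's a toy sketch in Python. This is not the real BitTorrent protocol (no tracker, no piece hashing, no network), just the core swarm idea: a file split into pieces, peers exchanging the pieces each one is missing. All names here (`Peer`, `exchange`, etc.) are made up for illustration.

```python
# Toy sketch of piece-based exchange, NOT real BitTorrent:
# no tracker, hashing, choking, or networking, just the swarm idea.
PIECE_COUNT = 4  # pretend the file is split into 4 pieces

class Peer:
    def __init__(self, name, pieces):
        self.name = name
        self.pieces = dict(pieces)  # piece index -> piece data

    def missing(self):
        # Indices of pieces this peer does not have yet.
        return [i for i in range(PIECE_COUNT) if i not in self.pieces]

    def exchange(self, other):
        # Pull any piece the other peer has that we are missing,
        # and serve them any piece we have that they are missing.
        for i in self.missing():
            if i in other.pieces:
                self.pieces[i] = other.pieces[i]
        for i in other.missing():
            if i in self.pieces:
                other.pieces[i] = self.pieces[i]

seeder = Peer("seeder", {0: b"aa", 1: b"bb", 2: b"cc", 3: b"dd"})
leecher = Peer("leecher", {0: b"aa"})  # only has the first piece
leecher.exchange(seeder)
print(leecher.missing())  # -> [] : the leecher now has every piece
```

In a real swarm there are many leechers, so each of them can also serve the pieces it already holds to the others instead of everyone hammering the one original seeder.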
So seeding implies having enough upstream bandwidth to help others who still have missing parts. Note that most BitTorrent clients already have useful seeding defaults. Typically they'll let you seed (namely share file parts) even after you have downloaded everything, up to a positive ratio, e.g. 2/1, meaning that you will keep sharing until you have uploaded about twice as much as you downloaded.
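The seed-ratio rule is just a comparison of bytes uploaded vs bytes downloaded. A minimal sketch of how a client might decide when to stop seeding (the function name and defaults are illustrative, not from any real client):

```python
def keep_seeding(uploaded_bytes, downloaded_bytes, target_ratio=2.0):
    """Return True while the share ratio is below the target.

    target_ratio=2.0 mirrors the 2/1 example: stop once you have
    uploaded roughly twice what you downloaded.
    """
    if downloaded_bytes == 0:
        # Original seeder: downloaded nothing, so keep sharing.
        return True
    return uploaded_bytes / downloaded_bytes < target_ratio

print(keep_seeding(1_000, 1_000))  # True  (ratio 1.0, below 2/1)
print(keep_seeding(2_000, 1_000))  # False (ratio 2.0, target reached)
```

Real clients usually expose this as a "stop seeding at ratio X" setting, often alongside a time-based limit.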
Hope that helps!