What's this about?
Anubis is a simple anti-scraper defense that weighs a web client's soul by giving it a tiny proof-of-work challenge (a calculation with no efficient shortcut, like a cryptographic hash puzzle) before letting it pass through to the actual website. The workload is insignificant for a human user, but very taxing for high-volume scrapers. The calculation runs on the client's side as JavaScript code.
(edit) For clarification: this works because the computation workload takes a relatively long time, not because it bogs down the CPU. Halting each request at the gate for only a few seconds adds up very quickly.
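Here's a minimal hashcash-style sketch of the idea in TypeScript. The names (`solveChallenge`, `verifyChallenge`) and the exact challenge format are illustrative assumptions, not Anubis' actual API; the real thing runs in the browser via Web Crypto, and `node:crypto` just keeps the sketch self-contained:

```typescript
import { createHash } from "node:crypto";

// Find a nonce such that SHA-256(challenge + nonce) starts with
// `difficulty` zero hex digits. There's no shortcut: the only route
// is brute force, ~16^difficulty hashes on average.
function solveChallenge(challenge: string, difficulty: number): number {
  const target = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const digest = createHash("sha256").update(challenge + nonce).digest("hex");
    if (digest.startsWith(target)) return nonce;
  }
}

// Verifying costs the server a single hash, so the gate stays cheap
// for the site even though solving is expensive for the client.
function verifyChallenge(challenge: string, nonce: number, difficulty: number): boolean {
  const digest = createHash("sha256").update(challenge + nonce).digest("hex");
  return digest.startsWith("0".repeat(difficulty));
}

const nonce = solveChallenge("random-server-token", 5); // a second or two of work
console.log(verifyChallenge("random-server-token", nonce, 5)); // true
```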
Recently, the FSF published an article that likened Anubis to malware because it's basically arbitrary code that the user has no choice but to execute:
Here's the article, and here's aussie linux man talking about it.
fwiw Anubis is working on a more respectful update; this was their first-pass solution for what was basically a break-glass emergency. I understand the FSF's concern, but Anubis is the only thing making a free and open internet remotely possible right now, and it's far better than nightmare fuel like Cloudflare.
Well, that's a typically abstract, to-the-letter take on the definition of software freedom from them. I think the practical necessity of doing something like this, especially for at-risk services like Invidious, and the fact that it's a harmless nonsense calculation really deserve an exception.
How did I know exactly who you were talking about before clicking the link?
The outro song played in my head...
But they can still scrape it, it just costs them computation?
Correct. Anubis' goal is to decrease the web traffic that hits the server, not to prevent scraping altogether. I should also clarify that this works because it costs the scrapers time with each request, not because it bogs down the CPU.
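To put rough, purely illustrative numbers on it (not Anubis' actual tuning): at ~3 seconds of compute per challenge, a human pays a few seconds once, while a scraper hammering a million pages without keeping its session pays about 3,000,000 seconds, or roughly 35 CPU-days.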
Why not then just make it a `setTimeout` or something so that it doesn't nuke the CPU of old devices?
Crawlers don't have to follow conventions or specifications. If one ships a `setTimeout` implementation that doesn't wait the specified amount of time and simply executes the callback immediately, it defeats the system. Proof-of-work ensures the delay can't be skipped, because the only way through is actually doing the computation.

Anubis is an emergency solution against the flood of scrapers deployed by massive AI companies. Everybody wishes it weren't necessary.
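To make that concrete, here's a minimal sketch (illustrative only, not Anubis or any real crawler's code) of why a timer-based gate is worthless against a client that controls its own runtime:

```typescript
// A crawler owns its JS runtime, so it can swap setTimeout for a
// function that fires the callback immediately.
globalThis.setTimeout = ((callback: () => void) => {
  callback(); // ignore the requested delay entirely
  return 0;
}) as unknown as typeof setTimeout;

// A "please wait 5 seconds" gate now completes instantly:
setTimeout(() => console.log("gate 'passed' with zero delay"), 5000);
```

A proof-of-work gate doesn't have this hole: there's no timer to patch out, because the server checks the computed answer itself.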
Beautiful