lads (lemmy.world)
[-] InFerNo@lemmy.ml 5 points 1 day ago

But they can still scrape it, it just costs them computation?

[-] rtxn@lemmy.world 13 points 1 day ago* (last edited 1 day ago)

Correct. Anubis' goal is to decrease the web traffic that hits the server, not to prevent scraping altogether. I should also clarify that this works because it costs the scrapers time with each request, not because it bogs down the CPU.
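For illustration, here's a minimal TypeScript sketch of a hash-based proof-of-work gate in the same spirit (not Anubis' actual code or API; the challenge format, function names, and difficulty scheme are made up). The client burns wall-clock time finding a nonce, while the server verifies the answer with a single cheap hash:

```ts
// Illustrative only: not Anubis' real challenge format or API.
import { createHash } from "node:crypto";

// Count leading zero bits of a hex-encoded SHA-256 digest.
function leadingZeroBits(hex: string): number {
  let bits = 0;
  for (const ch of hex) {
    const nibble = parseInt(ch, 16);
    if (nibble === 0) {
      bits += 4;
      continue;
    }
    bits += Math.clz32(nibble) - 28; // zero bits inside this nibble
    break;
  }
  return bits;
}

// Client side: brute-force a nonce. This is what costs the scraper time
// on every request, since only a valid nonce gets the page served.
function solveChallenge(challenge: string, difficulty: number): number {
  for (let nonce = 0; ; nonce++) {
    const digest = createHash("sha256")
      .update(`${challenge}:${nonce}`)
      .digest("hex");
    if (leadingZeroBits(digest) >= difficulty) return nonce;
  }
}

// Server side: checking the answer is one cheap hash, so legitimate
// traffic adds almost no load to the server.
function verify(challenge: string, nonce: number, difficulty: number): boolean {
  const digest = createHash("sha256")
    .update(`${challenge}:${nonce}`)
    .digest("hex");
  return leadingZeroBits(digest) >= difficulty;
}

const nonce = solveChallenge("example-challenge", 18);
console.log(verify("example-challenge", nonce, 18)); // true
```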

[-] Xylight@lemdro.id 1 point 1 day ago

Then why not just make it a setTimeout or something, so that it doesn't nuke the CPU of old devices?

[-] rtxn@lemmy.world 1 point 22 hours ago

Crawlers don't have to follow conventions or specifications. If one has a setTimeout implementation that doesn't wait the specified amount of time and simply executes the callback immediately, it defeats the system. Proof-of-work is meant to ensure that it's impossible to get around the time factor because of computational inefficiency.
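For example (a hypothetical snippet, not taken from any real crawler), a scraper that controls its own JS runtime can stub out setTimeout so that any delay-based gate passes instantly:

```ts
// Hypothetical: how a scraper's patched JS runtime could defeat a
// delay-only gate. Nothing here is from a real crawler.
(globalThis as any).setTimeout = (cb: (...args: unknown[]) => void): number => {
  cb(); // run the callback immediately, ignore the requested delay
  return 0;
};

// The page's "wait 5 seconds before showing content" gate:
function delayGate(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

delayGate(5000).then(() => {
  // Resolves instantly in the patched runtime: the scraper paid no time cost.
  console.log("gate passed with zero wait");
});
```

A proof-of-work check has no equivalent shortcut: the only way to produce a valid answer is to actually do the hashing.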

Anubis is an emergency solution against the flood of scrapers deployed by massive AI companies. Everybody wishes it wasn't necessary.
