
Anubis is designed to protect websites from AI scraper bots. It primarily inspects parameters like the User-Agent header sent with the request and looks for oddities in the connection. "Known good" and harmless clients are always accepted, while "known bad" clients are always denied. Now the same tool has been used to fend off a DDoS attack: https://fabulous.systems/posts/2025/05/anubis-saved-our-websites-from-a-ddos-attack/

#opensource #Linux #cybersecurity

pa@zusammenhalt.de · 2 weeks ago

@nixCraft@mastodon.social the JWT should be validated in the reverse proxy. Anubis would be a perfect authorization endpoint to point to when required. But I think the logic should be present in more server-side frameworks: PHP for the masses, a container image for classic Anubis, and embeddable libraries for Rust or Java services…
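A minimal sketch of that idea: a reverse-proxy middleware that checks the Anubis-issued JWT itself and only sends unauthenticated clients to the challenge endpoint. The cookie name "anubis-auth", the "/challenge" redirect, and the ANUBIS_PUBLIC_KEY_HEX environment variable are assumptions for illustration, not Anubis's documented interface; only the Ed25519/EdDSA JWT verification pattern is the point.

```go
// Sketch: validate an Anubis-style JWT at the proxy layer before forwarding
// to the backend, so the backend never sees unchallenged traffic.
package main

import (
	"crypto/ed25519"
	"encoding/hex"
	"log"
	"net/http"
	"os"

	"github.com/golang-jwt/jwt/v5"
)

func requireAnubisToken(pub ed25519.PublicKey, next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		// Hypothetical cookie name; adjust to whatever the challenge service sets.
		cookie, err := r.Cookie("anubis-auth")
		if err != nil {
			http.Redirect(w, r, "/challenge", http.StatusFound)
			return
		}
		// Accept only EdDSA-signed tokens and verify them against our public key.
		_, err = jwt.Parse(cookie.Value, func(t *jwt.Token) (any, error) {
			return pub, nil
		}, jwt.WithValidMethods([]string{"EdDSA"}))
		if err != nil {
			http.Redirect(w, r, "/challenge", http.StatusFound)
			return
		}
		// Token checks out: pass the request on to the protected service.
		next.ServeHTTP(w, r)
	})
}

func main() {
	raw, err := hex.DecodeString(os.Getenv("ANUBIS_PUBLIC_KEY_HEX"))
	if err != nil || len(raw) != ed25519.PublicKeySize {
		log.Fatal("set ANUBIS_PUBLIC_KEY_HEX to a 32-byte hex-encoded Ed25519 public key")
	}
	backend := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("hello from the protected backend\n"))
	})
	log.Fatal(http.ListenAndServe(":8080", requireAnubisToken(ed25519.PublicKey(raw), backend)))
}
```

The same check could equally live behind nginx's auth_request or a PHP middleware, which is the commenter's broader point: the verification logic is small enough to embed wherever the traffic terminates.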
