[-] Strawberry@lemmy.blahaj.zone 5 points 1 week ago

The bots scrape costly endpoints like the entire edit histories of every page on a wiki. You can't realistically cache every possible generated page ahead of time.

[-] jagged_circle@feddit.nl -2 points 1 week ago* (last edited 1 week ago)

Of course you can. This is why people use CDNs.

Put the entire site on a CDN with a cache of 24 hours for unauthenticated users.
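A minimal sketch of what the origin side could look like, assuming a Flask app sitting behind a CDN that honors Cache-Control, and assuming logged-in users are identified by a "session" cookie (both are placeholders, not anyone's actual setup):

```python
from flask import Flask, request

app = Flask(__name__)

@app.after_request
def set_cache_headers(response):
    if "session" in request.cookies:
        # Authenticated: keep these responses out of the shared CDN cache.
        response.headers["Cache-Control"] = "private, no-store"
    else:
        # Anonymous: let the CDN serve the same cached copy for 24 hours.
        response.headers["Cache-Control"] = "public, max-age=86400"
    # Tell caches that the response depends on the Cookie header.
    response.headers["Vary"] = "Cookie"
    return response
```

With headers like these, most CDNs will absorb anonymous traffic (including scrapers) for the cache lifetime and only pass authenticated requests through to the origin.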
