lads (lemmy.world)
[-] grysbok@lemmy.sdf.org 8 points 2 days ago

I just looked at my log for this morning. 23% of my total requests were from the useragent GoogleOther. Other visitors include GPTBot, SemanticScholarBot, and Turnitin. Those are the crawlers that are still trying after I've had Anubis on the site for over a month. It was much, much worse before, when they could actually crawl the site instead of being blocked.

That doesn't include the bots that lie about being bots. Looking back at an older screenshot of my monitoring (I don't have the logs themselves anymore), I seriously doubt I had 43,000 unique visitors using Windows per day in March.
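(For anyone wondering how numbers like that get pulled together: something along these lines works against a standard combined-format access log. The filename and log format here are assumptions about a typical setup, not the exact tooling behind that screenshot.)

```python
# Rough sketch: tally user agents in a combined-format access log.
# Assumes the common Apache/nginx "combined" format, where the user agent
# is the last double-quoted field on each line; "access.log" is a placeholder.
import re
from collections import Counter

UA_RE = re.compile(r'"([^"]*)"\s*$')  # last quoted field on the line

counts = Counter()
with open("access.log", encoding="utf-8", errors="replace") as f:
    for line in f:
        m = UA_RE.search(line)
        if m:
            counts[m.group(1)] += 1

total = sum(counts.values())
for ua, n in counts.most_common(10):
    print(f"{n:>8} ({n / total:5.1%})  {ua}")
```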

[-] daniskarma@lemmy.dbzer0.com -2 points 2 days ago* (last edited 2 days ago)

Why would they request the same data so many times a day if the objective was AI model training? It makes zero sense.

Also, Google's bots obey robots.txt, so they are easy to manage.
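(As a rough illustration, not a claim about any particular site: Python's standard library can show what a given robots.txt actually permits for a named crawler. The URLs below are placeholders.)

```python
# Check what a site's robots.txt allows for specific crawler user agents,
# using only the Python standard library. example.org is a placeholder.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.org/robots.txt")
rp.read()

for agent in ("Googlebot", "GoogleOther", "GPTBot"):
    ok = rp.can_fetch(agent, "https://example.org/some/page")
    print(f"{agent}: {'allowed' if ok else 'disallowed'}")
```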

There may be tons of reasons Google is crawling your website, from ad research to any other kind of research. The only AI-related use I can think of is RAG, but that would take some user requests away, because if the user got the info through Google's AI response they would not visit the website. I suppose that would suck for the website owner, but it won't drastically increase the number of requests.

But for training I don't see it; there's no need at all to keep constantly scraping the same site for model training.

[-] grysbok@lemmy.sdf.org 7 points 2 days ago* (last edited 2 days ago)

Like I said, [edit: at one point] Facebook requested my robots.txt multiple times a second. You've not convinced me that bot writers care about efficiency.

[edit: they've since stopped, possibly because I now give a 404 to anything claiming to be from Facebook]
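(A minimal sketch of that idea, assuming a Python WSGI app; the real block was presumably done at the web-server level, so this is just to show the shape of it.)

```python
# Hypothetical WSGI middleware: anything claiming a Facebook user agent
# (e.g. "facebookexternalhit") gets a 404 before reaching the application.
def block_facebook(app):
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        if "facebook" in ua:
            start_response("404 Not Found", [("Content-Type", "text/plain")])
            return [b"Not Found"]
        return app(environ, start_response)
    return middleware
```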

[-] Quill7513@slrpnk.net 3 points 2 days ago

You've not convinced me that bot writers care about efficiency.

And why should bot writers care about efficiency when what they really care about is time? They'll burn through all your resources without regard, simply because they're not the ones paying.

[-] grysbok@lemmy.sdf.org 4 points 2 days ago

Yep, they'll just burn taxpayer resources (me and my poor servers) because it's not like they pay taxes anyway (assuming they are either a corporation or not based in the same locality as I am).

There's only one of me, and if I'm working on keeping the servers bare-minimum functional today, I'm not working on making something more awesome for tomorrow. "Linux sysadmin" is only supposed to be up to 30% of my job.

[-] grysbok@lemmy.sdf.org 3 points 2 days ago* (last edited 2 days ago)

I mean, I enjoy Linux sysadmining, but fighting bots takes time, experimentation, and research, and there's other stuff I should be doing. For example, accessibility updates to our websites. But accessibility doesn't matter a lick if you can't access the website anyway due to timeouts.
