[-] umbraroze@lemmy.world 23 points 2 weeks ago

I have no idea why the makers of LLM crawlers think it's a good idea to ignore bot rules. The rules are there for a reason and the reasons are often more complex than "well, we just don't want you to do that". They're usually more like "why would you even do that?"

Ultimately you have to trust what the site owners say. The reason why, say, your favourite search engine returns the relevant Wikipedia pages and not a bazillion random old page revisions from ages ago is that Wikipedia said "please crawl the most recent versions using canonical page names, and do not follow the links to the technical pages (including history)". Again: why would anyone index those?
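(For anyone unfamiliar: the mechanism being described is the robots.txt convention. A simplified, illustrative excerpt of the kind of rules involved — not Wikipedia's actual file — looks like this:

```
User-agent: *
Allow: /wiki/
# Keep crawlers out of the technical pages: edit forms,
# old revisions, diffs, special pages
Disallow: /w/
Disallow: /wiki/Special:
```

Nothing enforces it; it only works because well-behaved crawlers choose to honour it, which is exactly the problem with LLM scrapers that don't.)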

[-] phoenixz@lemmy.ca 7 points 2 weeks ago

Because you are coming from the perspective of a reasonable person

These people are billionaires who expect to get everything for free. Rules are for the plebs, just take it already

[-] 4am@lemm.ee 20 points 2 weeks ago

Imagine how much power is wasted on this unfortunate necessity.

Now imagine how much power will be wasted circumventing it.

Fucking clown world we live in

[-] Demdaru@lemmy.world 5 points 2 weeks ago

On one hand, yes. On the other... imagine the frustration of the management at companies making and selling AI services. This is such a sweet thing to imagine.

[-] digdilem@lemmy.ml 17 points 2 weeks ago

Surprised at the level of negativity here. Having had my sites repeatedly DDOSed offline by Claudebot and others scraping the same damned thing over and over again, thousands of times a second, I welcome any measures to help.

[-] AWittyUsername@lemmy.world 10 points 2 weeks ago

I think the negativity is around the unfortunate fact that solutions like this shouldn't be necessary.

[-] surph_ninja@lemmy.world 14 points 2 weeks ago

I’m imagining a sci-fi spin on this where AI generators are used to keep AI crawlers in a loop, and they accidentally end up creating some unique AI culture or relationship in the process.

[-] biofaust@lemmy.world 12 points 2 weeks ago

I guess this is what the first iteration of the Blackwall looks like.

[-] owl@infosec.pub 5 points 2 weeks ago

Gotta say "AI Labyrinth" sounds almost as cool.

[-] kandoh@reddthat.com 10 points 2 weeks ago

Burning 29 acres of rainforest a day to do nothing

[-] gmtom@lemmy.world 8 points 2 weeks ago

"I used the AI to destroy the AI"

[-] Fluke@lemm.ee 2 points 2 weeks ago

And consumed the power output of a medium country to do it.

Yeah, great job! 👍

[-] LeninOnAPrayer@lemm.ee 4 points 2 weeks ago* (last edited 2 weeks ago)

We truly are getting dumber as a species. We're facing climate change, but running some of the most power-hungry processors in the world to spit out cooking recipes and homework answers for millions of people. All to better collect their data and sell them products that will distract them from the climate disaster our corporations have caused. It would be really fun to watch if it weren't so sad.

[-] cantstopthesignal@sh.itjust.works 2 points 2 weeks ago

We had to kill the internet, to save the internet.

[-] AtomicHotSauce@lemmy.world 6 points 2 weeks ago

That's just BattleBots with a different name.

[-] aviationeast@lemmy.world 3 points 2 weeks ago

You're not wrong.

[-] oldfart@lemm.ee 6 points 2 weeks ago

So the web is a corporate war zone now, and you can choose between feudal protection and being attacked from all sides. What a time to be alive.

[-] RelativeArea1@sh.itjust.works 4 points 2 weeks ago* (last edited 2 weeks ago)

This is some fucking stupid situation: we finally got faster internet, and now these bots messing with each other are hogging the bandwidth.

[-] peoplebeproblems@midwest.social 3 points 2 weeks ago

Not exactly how I expected the AI wars to go, but I guess since we're in a cyberpunk world, we take what we get

[-] rocket_dragon@lemmy.dbzer0.com 5 points 2 weeks ago

Next step is an AI that detects AI labyrinth.

It gets trained on labyrinths generated by another AI.

So you have an AI generating labyrinths to train an AI to detect labyrinths which are generated by another AI so that your original AI crawler doesn't get lost.

It's gonna be AI all the way down.
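(The trap at the bottom of that chain doesn't even need AI to work. Here's a toy sketch of the general technique — procedurally generated pages whose links never run out — not Cloudflare's actual AI Labyrinth; all names here are made up:

```python
# Toy crawler "labyrinth": every URL deterministically yields a page of
# filler text plus links deeper into the maze, so a bot that follows
# links never runs out of pages to fetch.
import hashlib
import random

def labyrinth_page(path: str, n_links: int = 5) -> str:
    """Generate an HTML page for `path` with links further into the maze."""
    # Seed the RNG from the path so the same URL always yields the same page.
    seed = int.from_bytes(hashlib.sha256(path.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    words = ["signal", "archive", "lattice", "orbit", "glyph", "cipher"]
    filler = " ".join(rng.choice(words) for _ in range(40))
    links = "".join(
        f'<a href="{path}/{rng.choice(words)}-{rng.randrange(10**6)}">more</a>\n'
        for _ in range(n_links)
    )
    return f"<html><body><p>{filler}</p>\n{links}</body></html>"

page = labyrinth_page("/maze")
```

Serving this is nearly free for the host; following it is not free for the crawler, which is the whole asymmetry being discussed.)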

[-] brucethemoose@lemmy.world 2 points 2 weeks ago* (last edited 2 weeks ago)

LLMs tend to be really bad at detecting AI-generated content, and I can’t imagine specialized models are much better. For the crawler operators it’s also vastly more expensive and more human work, and it must be replicated for every crawler since they’re all so freaking secretive.

I think the hosts win here.

[-] AnthropomorphicCat@lemmy.world 2 points 2 weeks ago

So the world is now wasting energy and resources to generate AI content in order to combat AI crawlers, by making them waste more energy and resources. Great! 👍

[-] brucethemoose@lemmy.world 1 points 2 weeks ago* (last edited 2 weeks ago)

The energy cost of inference is overstated. Small models, or “sparse” models like Deepseek are not expensive to run. Training is a one-time cost that still pales in comparison to, like, making aluminum.

Doubly so once inference goes more on-device.

Basically, only Altman and his tech bro acolytes want AI to be cost prohibitive so he can have a monopoly. Also, he’s full of shit, and everyone in the industry knows it.

AI as it’s implemented has plenty of enshittification, but the energy cost is kinda a red herring.

[-] quack@lemmy.zip 2 points 2 weeks ago* (last edited 2 weeks ago)

Generating content with AI to throw off crawlers. I dread to think of the resources we’re wasting on this utter insanity now, but hey who the fuck cares as long as the line keeps going up for these leeches.

this post was submitted on 21 Mar 2025
155 points (100.0% liked)

Technology
