Lavalamp too hot (discuss.tchncs.de)
submitted 2 days ago* (last edited 2 days ago) by swiftywizard@discuss.tchncs.de to c/programmer_humor@programming.dev
[-] Feathercrown@lemmy.world 7 points 1 day ago* (last edited 1 day ago)

Hmm, interesting theory. However:

  1. We know this is an issue with language models; it happens all the time with weaker ones, so there is an alternative explanation.

  2. LLM providers are running at a loss right now; serving you more tokens costs them more than you pay, so there is no motive. (See the sketch after this list.)
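
To put numbers on it, here's a minimal back-of-envelope sketch; both per-token figures are purely hypothetical placeholders, since real pricing and serving costs aren't public:

```python
# Back-of-envelope: if serving a token costs more than the user is billed
# for it, every extra token generated *increases* the provider's loss.
# Both per-token figures below are hypothetical placeholders.

price_per_token = 0.000002  # hypothetical: what the user is billed per token
cost_per_token = 0.000003   # hypothetical: what inference actually costs per token

def profit(tokens: int) -> float:
    """Profit (negative = loss) on a given number of billed tokens."""
    return tokens * (price_per_token - cost_per_token)

print(f"1M tokens: {profit(1_000_000):+.2f} USD")  # -1.00: a loss
print(f"2M tokens: {profit(2_000_000):+.2f} USD")  # -2.00: bloating output doubles the loss
```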

[-] jerkface@lemmy.ca 3 points 1 day ago

It was proposed less as a hypothesis about reality than as virtue signalling (in the original sense).

[-] MotoAsh@piefed.social 1 point 12 hours ago* (last edited 12 hours ago)

No, it wasn't a virtue signal, you fucking dingdongs.

Capitalism is rife with undercooked products because getting a product out there starts the income flowing sooner. A company doesn't have to be making a profit for a revenue stream to make sense.

Fuck, it's like all you idiots can do is project your lack of understanding onto others...

[-] MotoAsh@piefed.social -2 points 1 day ago

Of course there's a technical reason for it, but they have an incentive to try and sell even a shitty product.

[-] Feathercrown@lemmy.world 1 point 1 day ago

I don't think this really addresses my second point.

[-] MotoAsh@piefed.social 1 point 12 hours ago

How does it not? This isn't a fucking debate. How would artificially bloating the number of tokens they sell not help their bottom line?
