top 15 comments
[-] dyathinkhesaurus@lemmy.world 17 points 1 week ago

Bromide sedatives vanished from the US market by 1989, after the Food and Drug Administration banned them

So Robert Fucken Kennedy Jr will be all over this one shortly. Expect more psychosis I guess...

[-] breezeblock@lemmy.ca 13 points 1 week ago

We’re too stupid as a race to survive…

[-] finitebanjo@lemmy.world 8 points 1 week ago

If things like this keep happening that problem might solve itself.

[-] FelixCress@lemmy.world 12 points 1 week ago

Darwin's natural selection.

[-] TheBat@lemmy.world 10 points 1 week ago

Is this a Breaking Bad reference?

[-] DandomRude@lemmy.world 10 points 1 week ago

ChatGPT is allowed, but chocolate eggs with toys inside are supposed to be too dangerous? 🤔

[-] grue@lemmy.world 2 points 1 week ago

I am literally just making this up on the spot and have no evidence for it, but I'm starting to wonder if maybe the real reason for the kinder egg ban was anticompetitive lobbying by Hershey or something like that, and the toy thing was just the excuse.

[-] Psaldorn@lemmy.world 1 point 1 week ago

Both are dumbasses swallowing anything

[-] RoidingOldMan@lemmy.world 4 points 1 week ago

Yikes. Imagine if he could sue for that.

[-] ExLisper@lemmy.curiana.net 3 points 1 week ago

Can we skip ahead to the point where LLMs are just a bunch of preapproved answers and legal warnings? Maybe we could then hire experts to answer some of the questions? Organize it like some sort of message board. Let people search previous questions. Wait...
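For what it's worth, the "preapproved answers" pipeline described above fits in a few lines. A tongue-in-cheek sketch; every question, answer, and the warning string here is made up for illustration:

```python
# Toy "preapproved answers + legal warning" responder, as the comment describes.
PREAPPROVED = {
    "can i replace table salt with sodium bromide":
        "No. Consult a qualified medical professional.",
}
LEGAL_WARNING = "This is not medical, legal, or financial advice."

def answer(question: str) -> str:
    # Normalize the question, then look it up; fall back to "search the archive".
    key = question.strip().lower().rstrip("?")
    canned = PREAPPROVED.get(
        key, "No preapproved answer found. Try searching previous questions."
    )
    return f"{canned}\n\n{LEGAL_WARNING}"

print(answer("Can I replace table salt with sodium bromide?"))
```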

[-] NutWrench@lemmy.ml 2 points 1 week ago

An early AI was once asked, "Bob has a headache. What should Bob do?" And the AI replied, "Bob should cut off his own head."

The point being: AIs will give you logical solutions to your problems but they won't always give you practical ones.

[-] krunklom@lemmy.zip 2 points 1 week ago

Except they won't always give you logical answers.

[-] Kornblumenratte@feddit.org 1 point 1 week ago

That's a misconception of how LLMs work. It's how SF authors imagined AI would work.

LLMs won't give you logical solutions to your problems; they'll give you the essence of the data they were fed, whatever is statistically likely to be associated with the words you used to prompt them. And since they are usually trained on the enshittified internet, well, you get what you paid for.
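A minimal sketch of that idea: a toy bigram model, nothing like a production LLM, that just emits whatever continuation is statistically common in its training text (the corpus here is made up for illustration):

```python
import random
from collections import defaultdict

training_text = (
    "bromide is toxic . chloride is common . "
    "salt is sodium chloride . bromide replaced chloride ."
)

# Count which words follow which: the entire "knowledge" of this model.
follows = defaultdict(list)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev].append(nxt)

def generate(prompt_word: str, length: int = 8) -> str:
    """No logic, no goals: just sample statistically likely next words."""
    out = [prompt_word]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))  # frequency-weighted sample
    return " ".join(out)

print(generate("bromide"))  # e.g. "bromide is toxic . chloride is common ."
```

Garbage corpus in, garbage continuation out; scale doesn't change that, it just makes the output more fluent.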

[-] Allero@lemmy.today 1 point 1 week ago

60-year-old man who had a "history of studying nutrition in college" decided to try a health experiment: He would eliminate all chlorine from his diet

Oh well, it started before ChatGPT even had a chance to make it worse.

[-] Dagwood_Sanwich@lemmy.world 1 point 1 week ago

Just wait until AI "partners" are a thing.
Person: Honey, can you cook for me?
AI: Sure thing. [proceeds to add highly toxic ingredients to the food]

this post was submitted on 08 Aug 2025
91 points (100.0% liked)

Not The Onion


Welcome

We're not The Onion! Not affiliated with them in any way! Not operated by them in any way! All the news here is real!

The Rules

Posts must be:

  1. Links to news stories from...
  2. ...credible sources, with...
  3. ...their original headlines, that...
  4. ...would make people who see the headline think, “That has got to be a story from The Onion, America’s Finest News Source.”

Please also avoid duplicates.

Comments and post content must abide by the server rules for Lemmy.world and generally abstain from trollish, bigoted, or otherwise disruptive behavior that makes this community less fun for everyone.

And that’s basically it!
