[-] French75@slrpnk.net 6 points 3 days ago

Headline is super misleading... the article says that ChatGPT told him it couldn't give him drug advice and that he should seek help. He actually got good advice from ChatGPT, but he didn't like it and didn't trust it, then spent months trying to get ChatGPT to give him the dodgy advice he wanted.

Of course ChatGPT shouldn't be giving that sort of advice, but man, that headline is as misleading as it gets. He literally didn't trust the advice he got from ChatGPT to seek help.

[-] Shanmugha@lemmy.world 1 point 3 days ago

Come on. I can get my hands on some robotic parts, connect them, and program them to cut my head off. Is the technology at fault here? Or is it better to ask why the fuck there is no mandate for the LLM to report this to some institution, and for that institution to react, coupled with severe punishments if either fails to?

this post was submitted on 17 Jan 2026
6 points (80.0% liked)

Hacker News


Posts from the RSS Feed of HackerNews.

The feed sometimes contains ads and posts that have been removed by the mod team at HN.
