I think this one’s getting downvoted by people who haven’t read the article. The argument is that because LLMs respond like people with anxiety, depression, and PTSD, and because people with those conditions interact with LLMs, the LLMs are likely to intensify or exacerbate those symptoms in the humans who interact with them. The researchers weren’t trying to fix the LLMs through therapy.
This is so fucking dumb. All this says is that the researchers do not understand what LLMs actually are - that is, that they’re essentially just a bunch of Markov chains layered on top of each other. They are not sentient or sapient.
Stop fucking anthropomorphizing LLMs
Have you considered the possibility that the kinds of researchers who publish in Nature may have taken the time to do some basic research into how LLMs work before commissioning a study, and that anthropomorphizing may not be what’s happening here?