Help. (mander.xyz)
[-] mitch@piefed.mitch.science 53 points 1 day ago

FWIW, this is why AI researchers have been screeching for decades not to create an AI that is anthropomorphized. It's already an issue we have with animals; now we're going to add a confabulation engine to the ass-end?

[-] uuldika@lemmy.ml 9 points 22 hours ago* (last edited 21 hours ago)

LLMs are trained on human writing, so they'll always be fundamentally anthropomorphic. you could fine-tune them to sound more clinical, but it's likely to make them worse at reasoning and planning.

for example, I notice GPT5 uses "I" a lot, especially saying things like "I need to make a choice" or "my suspicion is." I think that's actually a side effect of the RL training they've done to make it more agentic. having some concept of self is necessary when navigating an environment.

philosophical zombies are no longer a thought experiment.

[-] Jankatarch@lemmy.world 15 points 1 day ago

Yeah, apparently even ELIZA messed with people back in the day, and that's not even an LLM.

[-] Feathercrown@lemmy.world 19 points 1 day ago

I'm starting to realize how easily fooled people are by this stuff. The average person cannot be this stupid, and yet, they are.

[-] Xerxos@lemmy.ml 2 points 11 hours ago

The average IQ is 100. That's not a lot, and half the population is below that. I'm more surprised at how bad our education system is at filtering out the dumb people. Someone who is 'not smart' but has a good memory and is diligent can make it frighteningly far in our society. Not to mention nepo babies, who are a different kind of problem.

[-] slaneesh_is_right@lemmy.org 14 points 1 day ago

I was once in a restaurant, and behind me was a group of twenty-something-year-olds. I overheard someone asking something like: "So what are y'all's thoughts on VR?" (This was just before the whole AI boom.) And one guy said: "It's kind of scary to think about." I was super confused at that point, and they talked about how they'd heard people disappear into cyberspace and stop knowing what's real and what's just VR.

I don't think they were stupid, but they formed a very strong opinion about something they clearly didn't know anything about.

[-] bitchkat@lemmy.world 1 point 4 hours ago

Sounds like they watched South Park and couldn't tell that it's not real.

[-] LemmyThinkAboutIt@lemmy.zip 2 points 8 hours ago

Without hearing the actual conversation, I feel like maybe he was just having trouble describing his thoughts about it. I take "disappearing into cyberspace" to mean someone becoming so addicted to VR that they don't want to leave whatever virtual reality they're in, and possibly using it so much that the lines between reality and virtual reality become blurred. Or the guy really does think people get sucked into cyberspace; I really don't know with people anymore.

[-] Feathercrown@lemmy.world 3 points 16 hours ago

I don’t think they were stupid, but they formed a very strong opinion about something they clearly didn’t know anything about.

That's a subcategory of being stupid, to be fair.

[-] tomatoely@sh.itjust.works 10 points 1 day ago

I'd like to believe he heard a summary of Sword Art Online's plot and thought it was real.

[-] Jankatarch@lemmy.world 5 points 1 day ago

Wait, it's not? But the Matrix!

[-] Cethin@lemmy.zip 5 points 23 hours ago

People have this issue with video game characters who don't even pretend to have intelligence. This could only go wrong.

[-] Buddahriffic@lemmy.world 3 points 1 day ago

Personally, I hate the idea of not doing something because there are idiots out there who will fuck themselves up on it. The current gen of AI might be a waste of resources, and the whole goal of AI might be incompatible with society's existence; those are good reasons to at least be cautious about AI.

I don't think people wanting to have relationships with an AI is a good reason to stop it, especially considering that it might even be a good option for some people who would otherwise just have no one, or maybe too many cats to care for. Consider the creepy stalker type who thinks liking someone or something gives them ownership over that person or thing. Better for them to be obsessed with an LLM they can't hurt than with a real person they might hurt (or will at least make uncomfortable, even if they end up being harmless overall).
