Anthropomorphic (mander.xyz)
35 comments
[-] bitchkat@lemmy.world 4 points 5 months ago

Just remember kids, do not under any circumstances anthropomorphize Larry Ellison.

[-] cynar@lemmy.world 3 points 5 months ago

I just spent the weekend driving a remote controlled Henry hoover around a festival. It's amazing how many people immediately anthropomorphised it.

It got a lot of head pats, and cooing, as if it was a small, happy, excitable dog.

[-] flora_explora@beehaw.org 3 points 5 months ago

Tbf I would have gasped because of the violent action of breaking a pencil in half, no projection of personality needed...

[-] kromem@lemmy.world 3 points 5 months ago* (last edited 5 months ago)

While true, there's a very big difference between correctly not anthropomorphizing the neural network and incorrectly not anthropomorphizing the data compressed into weights.

The data is anthropomorphic, and the network self-organizes the data around anthropomorphic features.

For example, the older generation of models will choose to be the little spoon around 70% of the time and the big spoon around 30% of the time if asked 0-shot, as there's likely a mix in the training data.

But one of the SotA models picks little spoon every single time dozens of times in a row, almost always grounding on the sensation of being held.

It can't be held, and yet its output still skews away from the norm based on that sensation anyway.

People who pat themselves on the back for being so wise as to not anthropomorphize are going to be especially surprised by the next 12 months.

[-] ObstreperousCanadian@lemmy.ca 2 points 5 months ago

It's TTRPG designer Greg Stolze!

[-] IsThisAnAI@lemmy.world 2 points 5 months ago

I feel like half this class went home saying, akchtually I would have gasped at you randomly breaking a non-humanized pencil as well. And they are probably correct.

[-] veganpizza69@lemmy.world 2 points 5 months ago

There's also the issue of imagining conscious individuals as not-people.

[-] WideningGyro@hexbear.net 2 points 5 months ago
[-] voracitude@lemmy.world 1 points 5 months ago* (last edited 5 months ago)

~~I would argue that first person in the image is turned right around. Seems to me that anthropomorphising a chat bot or other inanimate objects would be a sign of heightened sensitivity to shared humanity, not reduced, if it were a sign of anything. Where's the study showing a correlation between anthropomorphisation and callousness? Or whatever condition describes not seeing other people as fully human?~~

I misunderstood the first time around, but I still disagree with the idea that the Turing Test measures how "human" the participant sees other entities. Is there a study that shows a correlation between anthropomorphisation and tendencies towards social justice?

[-] Aethr@lemmy.world 2 points 5 months ago

Heightened sensitivity, but reduced accuracy, which is their point, I believe.

[-] voracitude@lemmy.world 1 points 5 months ago

Dammit, you're right 😅 Thanks!

[-] MindTraveller@lemmy.ca -1 points 5 months ago

According to the theory of conscious realism, physical matter is an illusion and the nature of reality is conscious agents. Thus, Tim the Pencil is conscious.

this post was submitted on 03 Jun 2024
1076 points (97.3% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



Rules

  1. Don't throw mud. Behave like an intellectual and remember the human.
  2. Keep it rooted (on topic).
  3. No spam.
  4. Infographics welcome, get schooled.

This is a science community. We use the Dawkins definition of meme.



founded 2 years ago