submitted 1 month ago by cum_hoc@lemmy.world to c/science@lemmy.world

From the article:

This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren't 'too far gone' to reconsider their convictions and change their minds.

some_guy@lemmy.sdf.org 29 points 1 month ago

The researchers think a deep understanding of a given theory is vital to tackling errant beliefs. "Canned" debunking attempts, they argue, are too broad to address "the specific evidence accepted by the believer," which means they often fail. Because large language models like GPT-4 Turbo can quickly reference web-based material related to a particular belief or piece of "evidence," they mimic an expert in that specific belief; in short, they become a more effective conversation partner and debunker than can be found at your Thanksgiving dinner table or heated Discord chat with friends.

This is great news. Talking these people down takes a heavy emotional and mental toll on whoever does it. Offloading that labor to software is a genuinely valuable use of the technology.
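
For the curious, here's a minimal sketch of what a tailored-debunking chat loop like the one described above might look like against the OpenAI API. The model name, system prompt wording, and helper function are illustrative assumptions, not the study's actual setup:

```python
# Minimal sketch of a tailored-debunking conversation loop.
# Assumptions: the "gpt-4-turbo" model name and the prompt wording
# are illustrative, not the researchers' actual configuration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a patient, well-sourced interlocutor. The user believes a "
    "specific conspiracy theory. Address the particular evidence they "
    "cite, point by point, with verifiable facts. Stay respectful."
)

def debunking_turn(history: list[dict], user_message: str) -> str:
    """Send one conversational turn and return the model's reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4-turbo",
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    history: list[dict] = []
    print(debunking_turn(
        history,
        "The moon landing was filmed in a studio because the flag "
        "waves even though there's no air.",
    ))
```

The point of keeping the full `history` is that the model can keep engaging with the believer's specific claims over multiple turns, which is exactly the "specific evidence accepted by the believer" angle the researchers say canned debunkings miss.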
