[-] potsnpans@beehaw.org 0 points 1 year ago

Thank you for this!

I have to say though, it's really interesting to see the reactions here, given the paper's findings. Because in the study, while people got better at spotting fake news after the game/test, they got worse at identifying real news, and overall more distrustful of news in general. I feel like that's on display here - with people (somewhat correctly) mistrusting the misleading article, but also (somewhat incorrectly) mistrusting the research behind it.

[-] potsnpans@beehaw.org 1 points 1 year ago

Hooo boy. This article is wildly misrepresenting both the study and its findings.

  1. The study did not set out to test ability to judge real/fake news across demographic differences. The study itself was primarily looking to determine the validity of their test.
  2. Because of this, their validation sample is wildly different from the sample observed in the online "game" version. As in, the original sample vetted participants, and also removed any who failed an "attention check", neither of which were present in the second test.
  3. Demographics on the portion actually looking at age differences are... let's say biased. There are far more young participants, with only ~10% over 50. The vast majority (almost 90%!) were college educated. And the sample trended liberal to a significant degree.
  4. All the above suggests that the demographic most typically considered "bad" at spotting fake news (conservative boomers who didn't go to college) was massively underrepresented in the study. Which makes sense, given that participation in that portion relied on largely unvetted volunteers signing up to test their own ability to spot fake news.

Most critically, the study itself does not claim that differences between these demographics are representative. That portion is looking at differences in the sample pool before/after the test, to examine its potential for "training" people to spot fake news (this had mixed results, which they acknowledge). This article, ironically, is spreading misinformation about the study itself, and doing the researchers and its readers a great disservice.

[-] potsnpans@beehaw.org 4 points 1 year ago

Largely, yeah, we just don't know. In research terms, it's still quite early, so anything definitive is likely years away. However, research is starting to indicate it may be neurological - a product of damage to the brain and nervous system, as this article discusses.

[-] potsnpans@beehaw.org 2 points 1 year ago

That's fair, though personally I'm kind of glad they did. "Signal is a secure messaging app" is a lot easier to explain to non-tech-savvy people than "Signal is a secure messaging app, as long as you are messaging someone who is using Signal too. It can also send regular texts, but those can't be encrypted." Leaving that nuance out would have left people texting with a false assumption of security, but I lost several people explaining it because it "sounds complicated".

[-] potsnpans@beehaw.org 1 points 1 year ago

This exactly. Only I am quite certain it's already being used this way, on a much wider scale than we have any way to measure.

[-] potsnpans@beehaw.org 38 points 1 year ago

signal ftw ✊

[-] potsnpans@beehaw.org 11 points 1 year ago

I do think this is more an issue with science communication broadly than string theory specifically - every field has its own examples, and medicine is notorious for it - but she is right that scientific researchers (the subject matter experts) have a responsibility to accurately communicate their work when speaking to the public.

It's one thing for an enthusiast to inadvertently oversell a concept to the public as fact because they are excited and understand it at only a basic level. It's another entirely for someone who's been researching that concept for 30-40 years, with the express intent of proving or disproving its validity, to oversell it as fact when their whole job is to be intimately familiar with its shortcomings. They, of all people, should know better - and that means they have a responsibility to do better.

Science does get messy, by design, but it is the duty of those who communicate their science to be honest about that messiness, not mask it with unfounded statements to sell their ideas to people who don't have the research expertise to spot the falsehoods.
