cross-posted from: https://piefed.world/post/374427
paywall bypass: https://archive.is/whVMI
the study the article is about: https://www.thelancet.com/journals/langas/article/PIIS2468-1253(25)00133-5/abstract
article text:
AI Eroded Doctors’ Ability to Spot Cancer Within Months in Study
By Harry Black
August 12, 2025 at 10:30 PM UTC
Artificial intelligence, touted for its potential to transform medicine, led to some doctors losing skills after just a few months in a new study.
AI helped health professionals to better detect pre-cancerous growths in the colon, but when the assistance was removed, their ability to find tumors dropped by about 20% compared with rates before the tool was ever introduced, according to findings published Wednesday.
Health-care systems around the world are embracing AI with a view to boosting patient outcomes and productivity. Just this year, the UK government announced £11 million ($14.8 million) in funding for a new trial to test how AI can help catch breast cancer earlier.
The AI in the study probably prompted doctors to become over-reliant on its recommendations, “leading to clinicians becoming less motivated, less focused, and less responsible when making cognitive decisions without AI assistance,” the scientists said in the paper.
They surveyed four endoscopy centers in Poland and compared detection success rates three months before AI implementation and three months after. Some colonoscopies were performed with AI and some without, at random. The results were published in The Lancet Gastroenterology & Hepatology journal.
Yuichi Mori, a researcher at the University of Oslo and one of the scientists involved, predicted that the effects of de-skilling will “probably be higher” as AI becomes more powerful.
What’s more, the 19 doctors in the study were highly experienced, having performed more than 2,000 colonoscopies each. The effect on trainees or novices might be starker, said Omer Ahmad, a consultant gastroenterologist at University College Hospital London.
“Although AI continues to offer great promise to enhance clinical outcomes, we must also safeguard against the quiet erosion of fundamental skills required for high-quality endoscopy,” Ahmad, who wasn’t involved in the research, wrote in a comment published alongside the article.
A study conducted by MIT this year raised similar concerns after finding that using OpenAI’s ChatGPT to write essays led to less brain engagement and cognitive activity.
Not medically related, but educators I know are telling me that students becoming entirely dependent on AI is causing an issue far worse than forgetting a second language you don't use. It's like they're forgetting how to think, how to organize their thoughts, and how to make independent assessments.
I mean, those are still mental skills, even if we typically consider them so ubiquitous that we think of them as baseline function. Children with particularly overbearing parents have long had issues with independent assessment because they never had to, and were rarely allowed to, make their own decisions. If you sit there and go "grok, is this true?" for every single idea you hear and take the cyber prophet at its word, I wouldn't be surprised if you essentially cook your brain, since most critical functions are still skills that can weaken and atrophy.
If they weren't, education wouldn't be able to improve assessment abilities in the first place. It's worse than the language case because these skills are far more critical to everyday function: I can live without my second language, and at the end of the day that probably won't have catastrophic consequences.
The second language thing was just the quickest analogy I could think of. Something closer: post-COVID, educators I know mentioned that students were far less socially capable, since during isolation they missed building some social skills entirely and the ones they did have atrophied, leaving them relatively socially incompetent compared to students who predated the lockdowns.
There is a similarity for students with LLM use (also, just noting here that we're probably talking about very different technologies in this example than what the doctors are using; "AI" is being used as an obfuscating umbrella term that is only technically accurate. It's like discussing the harm caused by bullet trains and lifted diesel pickup trucks as if they're the same exact thing because they both transport you somewhere). Students don't build some of the skills they need if they've been leaning on LLMs since the technology went mainstream (three-ish years ago), and the skills they did have are probably weakened compared to where they were before.