cross-posted from: https://piefed.world/post/374427

paywall bypass: https://archive.is/whVMI

the study the article is about: https://www.thelancet.com/journals/langas/article/PIIS2468-1253(25)00133-5/abstract

article text:

AI Eroded Doctors’ Ability to Spot Cancer Within Months in Study

By Harry Black

August 12, 2025 at 10:30 PM UTC

Artificial intelligence, touted for its potential to transform medicine, led to some doctors losing skills after just a few months in a new study.

AI helped health professionals to better detect pre-cancerous growths in the colon, but when the assistance was removed, their ability to find tumors dropped by about 20% compared with rates before the tool was ever introduced, according to findings published Wednesday.

Health-care systems around the world are embracing AI with a view to boosting patient outcomes and productivity. Just this year, the UK government announced £11 million ($14.8 million) in funding for a new trial to test how AI can help catch breast cancer earlier.

The AI in the study probably prompted doctors to become over-reliant on its recommendations, “leading to clinicians becoming less motivated, less focused, and less responsible when making cognitive decisions without AI assistance,” the scientists said in the paper.

They surveyed four endoscopy centers in Poland and compared detection success rates three months before AI implementation and three months after. Some colonoscopies were performed with AI and some without, at random. The results were published in The Lancet Gastroenterology and Hepatology journal.

Yuichi Mori, a researcher at the University of Oslo and one of the scientists involved, predicted that the effects of de-skilling will “probably be higher” as AI becomes more powerful.

What’s more, the 19 doctors in the study were highly experienced, having performed more than 2,000 colonoscopies each. The effect on trainees or novices might be starker, said Omer Ahmad, a consultant gastroenterologist at University College Hospital London.

“Although AI continues to offer great promise to enhance clinical outcomes, we must also safeguard against the quiet erosion of fundamental skills required for high-quality endoscopy,” Ahmad, who wasn’t involved in the research, wrote in a comment alongside the article.

A study conducted by MIT this year raised similar concerns after finding that using OpenAI’s ChatGPT to write essays led to less brain engagement and cognitive activity.
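A note on the headline figure: "dropped by about 20%" reads as a relative drop in the detection rate, not a fall of 20 percentage points. A minimal sketch of the arithmetic, using hypothetical before/after rates (the article doesn't give the absolute numbers):

```python
# Illustrative only: the rates below are hypothetical placeholders,
# not the study's published figures.
before_adr = 0.28  # hypothetical detection rate before AI rollout
after_adr = 0.22   # hypothetical rate without AI, after months of AI use

absolute_drop = before_adr - after_adr                # in rate terms
relative_drop = (before_adr - after_adr) / before_adr

print(f"absolute drop: {absolute_drop * 100:.1f} percentage points")  # 6.0
print(f"relative drop: {relative_drop:.1%}")  # 21.4%, i.e. "about 20%"
```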

[-] RedWizard@hexbear.net 11 points 2 days ago

I'm not sure that this is an apt comparison, since laparoscopic tools still present themselves to the surgeon as "the instrument, which the worker animates and makes into his organ with his skill and strength, and whose handling therefore depends on his virtuosity." Laparoscopy isn't an autonomous system.

These AI systems take the entire process of identifying cancer and automate it. The doctors in this position are no longer required to have this knowledge, since the AI "possesses [the] skill and strength in place of the [doctor]," becoming the "virtuoso". Under our capitalist system, this leaves little incentive (given mass adoption of the technology) to continue expending capital on the necessary training. "Moreover, it must be remembered that the more simple, the more easily learned the work is, so much the less is its cost to production, the expense of its acquisition, and so much the lower must the wages sink – for, like the price of any other commodity, they are determined by the cost of production."

Obviously, this is only one task among many tasks the specialist performs, and there will still be a need for the whole of the specialist's skills. It does, however, produce worse outcomes if, say, the specialist is moved to a facility that lacks this technology after a significant amount of time relying on it. This isn't an issue for laparoscopy in "developed" countries; it is a nearly ubiquitous technology, making the skills of surgeons trained in laparoscopy very portable.

There is ultimately still a net positive here, since these models can be more accurate than humans at identifying cancer. It is, however, another illustration of the cognitive impact AI has on people who engage with it regularly, and it shows that the sublimation process described in Capital also applies to these AI systems, as machines in the labor process.

> Since laparoscopy isn't an autonomous system.

> These AI systems take the entire process of identifying cancer and automate it.

It is effectively the same thing: controlling tools with a controller automates numerous processes that they will 10000% not be able to perform as deftly with their hands, even if the steps performed are exactly the same.

And, again, is recognizing cancer in a scan the doctor's primary function, or is it knowing how to treat it once its presence is established? You could outsource all the radiography to another human being and still have the same outcome of "the doctor isn't as good at recognizing it anymore." There's a cost-benefit analysis to be done: is it better for a doctor to spend a lot of time looking at these scans, and be better at looking at them as a result, or for them to do other shit with their time?

[-] RedWizard@hexbear.net 8 points 2 days ago* (last edited 2 days ago)

> It is effectively the same thing: controlling tools with a controller automates numerous processes that they will 10000% not be able to perform as deftly with their hands, even if the steps performed are exactly the same.

It's not effectively the same thing at all. One is an entirely new skill (laparoscopy); the other is the elimination of an entire skill (AI detection of cancer). Laparoscopy does nothing at all unless the surgeon is there to operate it, and the use of laparoscopy still demands the previous skills required to perform surgery in the first place.

> You could outsource all the radiography to another human being and still have the same outcome of "the doctor isn't as good at recognizing it anymore."

You wouldn't need to offload the entire process to another human being; you would simply eliminate that human from the labor force. In your scenario, there is still a human with the skill to identify cancer, whereas the AI process invites eliminating those positions, potentially leaving no one available for that task. The obvious issue with that is leaving the task fully in the hands of a black box, owned and operated by a for-profit corporation, whose incentives are dictated by the mechanics of capitalism and not the Hippocratic oath or some other human-centered demand.

Regardless, you appear to have ignored the part of my comment that states:

> Obviously, this is only one task among many tasks the specialist performs, and there will still be a need for the whole of the specialist's skills. It does, however, produce worse outcomes if, say, the specialist is moved to a facility that lacks this technology after a significant amount of time relying on it. This isn't an issue for laparoscopy in "developed" countries; it is a nearly ubiquitous technology, making the skills of surgeons trained in laparoscopy very portable. [...] There is ultimately still a net positive here, since these models can be more accurate than humans at identifying cancer.

And that comes with the huge caveat that @7bicycles@hexbear.net points out in his comment:

> This is too small scale. It is the healthcare system's job to find tumors, but that's bigger than a doctor. Therein lies the problem: I'm sure AI can do a good enough, even better, job of it than the guy who's looking at your X-rays after his 49-hour shift, no doubt. But that shit should be run by a nonprofit or government agency, which is definitely not the case here. Otherwise, there's the very real possibility of market capture. They do a bang-up job of it until every clinic no longer knows how to do it by hand, then jack up the prices or make the services worse. This needs to be open source and run by something that is not legally profit-oriented.

> It's not effectively the same thing at all. One is an entirely new skill (laparoscopy); the other is the elimination of an entire skill (AI detection of cancer).

No, it's the same damn thing. Laparoscopic surgery uses controllers and robots and shit, and a surgeon who does it all the time instead of traditional surgery is going to lose skill in performing surgery, in the same way that someone relying on automation to parse radiographic test results would lose their ability to read them properly.

Like, do you really think someone playing COD is going to retain skills with a gun? Come on, dawg.

> You wouldn't need to offload the entire process to another human being

It was a direct comparison to the use of AI for this purpose in general. The "other human being" is effectively the AI, and giving them the task of parsing radiographic test results would do the exact same damn thing to the doctor in this case: diminish their ability to read the results themselves.

> And that comes with the huge caveat that @7bicycles@hexbear.net points out in his comment:

Do you need to steal someone else's good point to make an argument? I didn't address it then because it's an entirely different argument ("this is bad because of capitalism") that I agreed with.
