184 points
submitted 8 months ago by FlappyBubble@lemmy.ml to c/privacy@lemmy.ml

As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure the process is soon intended to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the resulting model is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, my question is about educating them, not a legal one.

How do I present my case? I'm not willing to use a non-local AI to transcribe my voice, and I don't want to be perceived as a paranoid nutcase. Preferably I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanation/examples need to translate for the lay person.
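For comparison, the kind of fully local setup I would be comfortable with is technically feasible today. As a rough sketch (using the open-source Whisper model purely as an example; I don't know what engine the planned product actually uses, and the file name below is made up), transcription that never leaves the hospital's own machine could look like this:

```python
# Minimal local-transcription sketch using the open-source "whisper" package
# (pip install openai-whisper). Only an illustration of an on-premises
# alternative; the vendor's actual pipeline is unknown to me.
import whisper

# Load a speech-to-text model; the weights are downloaded once and then run
# entirely on the local machine (no audio leaves the computer).
model = whisper.load_model("base")

# Transcribe a dictated recording from the voice recorder (hypothetical file).
result = model.transcribe("dictation_2024-02-16.wav", language="sv")

print(result["text"])
```

The point isn't this exact tool, just that speech-to-text doesn't inherently require sending recordings, or a model of my voice, to someone else's servers.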

23 comments
[-] WitchHazel@lemmygrad.ml 2 points 8 months ago

Unfortunately a guy I know works for a gov hospital and they've used such technology for over a decade at this point. It seems unavoidable.

[-] lorty@lemmygrad.ml 1 points 8 months ago

This is really weird. Is it common in other countries for doctors to not input the data in the system themselves?

[-] FlappyBubble@lemmy.ml 1 points 8 months ago

I don't know if it's common practice in other countries. In Sweden, where I work, it is. I think the rationale is the following:

  • It's a lot faster to use a voice recorder.
  • A doctor's time is worth a lot more than a secretary's (in terms of pay and scarcity).
  • Using a voice recorder lets us review lab results, radiology etc. at the same time as recording, without having to switch between tasks.
  • Doctors won't have to be good spellers or think about building well-thought-out sentences. We also don't have to look up classification codes for procedures and diagnoses. All this will be done by the secretary.

Of course we have to review the transcribed result. At my hospital, all doctors carry smart cards and use the private key stored on them to digitally sign every transcribed medical record entry.
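For anyone curious what that signing step means in practice: conceptually it's a standard asymmetric signature. A rough Python sketch with the cryptography library is below; in reality the private key never leaves the smart card and the hospital uses certified signing software, so this only illustrates the principle.

```python
# Illustrative sketch: sign a reviewed record entry with a private key and
# verify it with the matching public key. A real smart-card setup keeps the
# key on the card; this is only the concept, not our actual software.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# In reality the key pair lives on the doctor's smart card.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

entry = "Transcribed dictation, reviewed and approved by Dr. X.".encode("utf-8")

# The doctor signs the reviewed entry...
signature = private_key.sign(entry, ec.ECDSA(hashes.SHA256()))

# ...and anyone with the public key can verify it hasn't been altered
# (verify() raises an exception if the entry or signature was tampered with).
public_key.verify(signature, entry, ec.ECDSA(hashes.SHA256()))
print("Signature valid")
```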

[-] SheeEttin@programming.dev 1 points 8 months ago

What, exactly, are your privacy concerns about this?

[-] 520@kbin.social 1 points 8 months ago* (last edited 8 months ago)

Not OP but if I were him/her: Leakage of patient data. Even if OP isn't responsible, simply being tied to an incident like this can look very bad in fields that rely heavily on reputation.

AI models are known to leak this kind of information; there are news articles about it all over.

[-] ZILtoid1991@kbin.social 1 points 8 months ago
  1. Go to the Minecraft servers of OpenAI and similar corporations.
  2. Find a room called "AI server room", all while avoiding or defeating the mobs protecting the area.
  3. Destroy everything there.
  4. Go to the offices.
  5. Destroy everything there.