What, exactly, are your privacy concerns about this?
My biometric data, in this case my voice: an AI trained on and tailored to my voice, out of my control, hosted as a cloud solution.
Of course there is an aspect of patient confidentiality too, but that battle is already lost. The data in the medical records is already hosted outside of my hospital.
Sounds like a weak argument. They're not going to be inclined to operate a local ML system just for one or two people.
I would see if you can get a quote for locally-hosted transcription software you can run on your own, like Dragon Medical. Maybe reach out to your IT department to see if they already have a working relationship with Nuance for that software. If they're willing to get you started, you can probably just use that for dictation and nobody will notice or care.
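For comparison, here is roughly what a fully local setup can look like with the open-source openai-whisper Python package. Just a sketch, not Dragon Medical or anything your IT department would necessarily support; the model size and audio file path are placeholders, but nothing in it leaves your machine:
```
# Minimal local dictation transcription sketch using openai-whisper.
# Everything runs on the local machine; no audio or text is sent anywhere.
import whisper

# "base" is a placeholder model size; larger models ("medium", "large")
# are more accurate but slower and need more memory.
model = whisper.load_model("base")

# "dictation.wav" is a placeholder path to a recorded dictation file.
result = model.transcribe("dictation.wav")

print(result["text"])
```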
Not OP, but if I were them: leakage of patient data. Even if OP isn't responsible, simply being tied to an incident like this can look very bad in fields that rely heavily on reputation.
AI models are known to leak this kind of information; there are news articles all over.