Shouldn't that be a HIPAA violation? Like, you can't in good conscience guarantee that the patient data isn't being used for anything other than healthcare.
My question is not a legal one. There are probably legal obstacles for my hospital in this case, but HIPAA is not applicable in my country.
I'd primarily like your opinions on how to effectively present my case to my bosses against using a non-local model for this.
Look to your local health privacy laws. Most countries regulate health data tightly enough that this use of AI may well be illegal.
Your question is not a legal one, but a legal argument can be a very persuasive one.
It is until they prove it isn't, which they might not be able to do. Many people trusted 23andMe, only to see their private data stolen. Make the company prove the security it has in place and the methods it uses to ensure privacy, because you'll essentially be liable for any failure of the system that stems from a lack of due diligence.
Voice recognition dictation has been used in the medical field for over a decade, probably longer. My regional health system of multiple hospitals and clinics has been using an electronic dictation solution, like Dragon, since at least 2012. Unfortunately, in this case OP is being overly paranoid and behind the times. I'm all for privacy, but the HIPAA implications have already been well sorted out. They need to either learn to type faster or use the system provided, which will increase their productivity and save the health system an FTE that used to go to a transcriptionist and can now be used more directly to care for patients.
"Overly paranoid", with the practically-daily breaches of cloud-based systems today?
It is true that Dragon and similar apps have been used for years. But I don't think it's fair to say OP is being paranoid or a Luddite. Cloud data breaches are a weekly occurrence, and OP wanting to protect their voice and biometrics isn't foolish; it's smarter than the average bear. You can change a compromised password. You can't change your biometrics or your voice.
Also, those products ran on local networks for many years before they moved to the cloud. Privacy gets reduced gradually over time, so people grow numb to it.
I think the issue is more that you're sending confidential health data to a third party, which is where you lose control. You don't know the intentions of people looking to steal that data, so you need to consider the worst possible outcomes and guard against them. AI training is just one possibility. Get creative: what could you do with a doctor's voice and their patients' private medical histories?
The simplest solution is to suspend the arrangement until the company can prove data security on its end, or to implement an offline solution on local servers not connected to the internet.
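For what it's worth, a fully on-prem setup isn't exotic anymore. Here's a minimal sketch of local transcription using the open-source Whisper model; the model size and file name are just placeholders, ffmpeg is assumed to be installed, and a real deployment would still need access controls and audit logging on top:

```python
# Minimal sketch of fully local dictation with open-source Whisper
# (pip install openai-whisper; requires ffmpeg on the system).
# The model size ("base") and file name are illustrative only.
import whisper

# Model weights are downloaded once and cached locally; after that,
# inference needs no network access at all.
model = whisper.load_model("base")

# Transcribe a recorded dictation; result["text"] holds the transcript.
# Neither the audio nor the transcript ever leaves the machine.
result = model.transcribe("dictation.wav")
print(result["text"])
```

That "nothing leaves the machine" property is exactly what you'd want to demonstrate to the bosses when arguing against the cloud vendor.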