Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

[-] BabaIsPissed@hexbear.net 6 points 1 month ago

This is fucked, you don't use a black box approach in anything high risk without human supervision. Whisper could probably be used to help accelerate transcriptions done by an expert, maybe as some sort of "first pass" that needs to be validated, but even then it might not speed things up and might impact quality (see coding with Copilot). Maybe also use the timestamp information to filter out the most egregious hallucinations, or a bespoke fine-tuning setup (assuming it was fine-tuned in the first place)? Just spitballing here, I should probably read the paper to see what the common error cases are.
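The timestamp-filtering idea above could look something like this: a minimal sketch that flags segments whose speaking rate is implausible, since hallucinated text is often emitted over near-silent audio. The `start`/`end`/`text` field names follow the segment dicts returned by the open-source `openai-whisper` package; the `MAX_CHARS_PER_SEC` threshold and the helper name are illustrative guesses, not tuned or official values.

```python
# Hypothetical first-pass filter for Whisper output: flag segments
# whose characters-per-second rate is implausibly high, a common
# signature of text hallucinated over silence. Threshold is a guess.
MAX_CHARS_PER_SEC = 30.0

def flag_suspect_segments(segments):
    """Split segments into (kept, flagged); flagged ones need human review."""
    kept, flagged = [], []
    for seg in segments:
        text = seg["text"].strip()
        if not text:
            continue  # empty segments carry no transcript content
        duration = seg["end"] - seg["start"]
        # Text crammed into (near-)zero audio time is a strong hint.
        rate = len(text) / duration if duration > 0 else float("inf")
        (flagged if rate > MAX_CHARS_PER_SEC else kept).append(seg)
    return kept, flagged

# Toy example: the second segment packs a long sentence into 0.1 s.
segments = [
    {"start": 0.0, "end": 4.0, "text": "The patient reported mild pain."},
    {"start": 4.0, "end": 4.1, "text": "Thanks for watching, like and subscribe!"},
]
kept, flagged = flag_suspect_segments(segments)
```

This would only catch one failure mode (text with no plausible audio behind it), so it's a complement to human review, not a replacement.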

It's funny, because this is the OpenAI model I had the least cynicism towards, did they bazinga it up when I wasn't looking?

this post was submitted on 29 Oct 2024
99 points (100.0% liked)
to technology@hexbear.net