this post was submitted on 06 Jul 2023
164 points (98.2% liked)
Technology
Not necessarily; something that overt would be obvious and avoided. I'm pretty sure what they're looking for are the subtler biases caused by bad datasets used to train the AI.
For instance, you train the AI and tell it which candidates are good or bad. But maybe, by pure happenstance, the best candidates in your dataset are all male. If so, the AI might be accidentally trained to believe that all good candidates are male.
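That kind of accidental correlation is easy to demonstrate with a toy example. The dataset and numbers below are entirely made up for illustration; the point is that a naive model fit on data where the good candidates happen to all be male will associate "male" with "good":

```python
# Hypothetical toy dataset: (is_male, years_experience, is_good_candidate).
# By pure happenstance, every "good" candidate here is male.
train = [
    (1, 5, 1), (1, 8, 1), (1, 3, 1),
    (0, 7, 0), (0, 2, 0), (1, 1, 0),
]

def label_rate_by_feature(data, feat_idx):
    """Fraction labeled 'good' when the binary feature is 1 vs when it is 0."""
    pos = [row[2] for row in data if row[feat_idx] == 1]
    neg = [row[2] for row in data if row[feat_idx] == 0]
    return sum(pos) / len(pos), sum(neg) / len(neg)

male_rate, female_rate = label_rate_by_feature(train, feat_idx=0)
print(male_rate, female_rate)  # 0.75 vs 0.0
```

A model trained on this data sees a strong statistical signal that being male predicts being a good candidate, even though gender had nothing to do with why those candidates were labeled good.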
I remember hearing about a high-profile case where the AI would dock points if someone's resume listed them as participating in women's sports as an extracurricular, while giving extra points if it listed them as participating in men's sports.
Also, bias doesn't necessarily have to come from happenstance. Unfortunately, humans tend to have unconscious (or, sometimes, not-so-unconscious) biases against women and people of color. There was a study where researchers sent identical resumes to a random group of recruiters-- but half of the resumes had a male name and half had a female name.
They found that both male and female recruiters rated the resumes with the male name higher and were more likely to recommend that those candidates be advanced to the next round of interviews. IIRC, similar studies have found similar results when the resumes are given a "Black sounding" name versus a "white sounding" name.
So if you train an AI on your own company's hiring data-- which is likely to be tainted by the unconscious bias of your own recruiters and hiring managers-- then the AI might pick up on that and replicate it in its results.
very interesting. somehow, resumes should be ranked with points. without gender or race or name. a point system based on... I guess merit? credentials? experience?
I feel like this should be a real thing. truly. but how.
Or you train the AI based on data from humans who are racist. If an AI learns how to hire from a racist set of human managers, it may take those biases into account when it is choosing who to hire.