An Algorithm Told Police She Was Safe. Then Her Husband Killed Her.
(www.nytimes.com)
Could a human have judged it better? Maybe not. I think the better question is: "Should anyone be sent back into a violent domestic situation with no additional protection, no matter the calculated risk?" And as someone who has been on the receiving end of that conversation and later narrowly escaped a total-family-annihilation situation, I would say no... no one should be told that, even though they were in a terrifying, life-threatening situation, they will not be given protection and no further steps will be taken to keep them from being injured again, or from being killed next time. But even without algorithms, that happens constantly... the only thing the algorithm accomplishes is that the investigator / social worker / etc. doesn't have to have any personal connection with the victim, so they don't have to feel some kind of way about handing an innocent person a death sentence; after all, they were just doing what the computer told them to.
Final thought: When you pair this practice with the ongoing conversation around the legality of women seeking divorce without their husband's consent, you have a terrifying and consistently deadly situation.
Yep. The ones who manage to slip notes to their veterinarian to help them get away are the exception.
Reading stuff like this makes me sick. All is not well with the world.
This even works for the people pulling the trigger. Just following orders, dura lex sed lex ("the law is harsh, but it is the law"), et cetera ad infinitum.
Yep! For all the psych nerds, it's pretty much a direct lift of the Milgram shock experiment.