this post was submitted on 08 Oct 2023
197 points (91.2% liked)
World News
Donald - "they just let you do it" - "beauty pageant dressing room" - Trump
🙄
Combine this with Kutcher claiming not to know about Masterson and Scientology's numerous trafficking accusations, and it really illuminates how unserious they were about the program.
This was always just a PR move for Kutcher, so he doesn't care if he's doing damage to legitimate sex workers. Much like FOSTA/SESTA, it's real easy for anyone who wants to boost their image, lawmakers included, to play the hero by taking a "protect the children at all costs, zero tolerance" stance and handwaving away the harm done while "helping".
Well said
Of course, that's the point.
No, what the ACLU did was knowingly leave the default setting of 80% confidence and then act "surprised" when they got almost exactly 20% false detections.
They knew exactly what they were doing.
Amazon even responded to their claims, criticising the lack of proper setup for such a complex system. The ACLU's excuse was that it was the default setting, so they just used whatever it came with out of the box.
So all they managed to prove is that the default setting isn't adequate for accurate identification, which says nothing about what the system is able to do when correctly configured.
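To make the threshold point concrete, here's a minimal sketch with made-up similarity scores (not real Rekognition output, and the numbers are invented for illustration): a low confidence threshold like 80% lets a lot of near-miss comparisons through as "matches", while a high threshold filters most of them out.

```python
# Sketch of how a confidence threshold changes false-match counts.
# Real face recognition systems return a similarity score per
# comparison; the scores below are purely hypothetical.

def matches_above(scores, threshold):
    """Return the comparisons whose similarity meets the threshold."""
    return [s for s in scores if s >= threshold]

# Hypothetical similarity scores for probes against people who are
# NOT actually in the gallery, so every "match" here is a false match.
non_match_scores = [55.0, 62.5, 81.2, 84.9, 88.3, 90.1, 99.4]

false_at_80 = matches_above(non_match_scores, 80.0)
false_at_99 = matches_above(non_match_scores, 99.0)

print(len(false_at_80))  # 5 false matches at the 80% threshold
print(len(false_at_99))  # 1 false match at a 99% threshold
```

Same scores, same system; only the operating threshold changed, which is exactly why "we used the default" isn't a meaningful test of what the system can do.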
Edit: I see I'm being downvoted for stating facts.
Is there any evidence to suggest that law enforcement agencies have these complex systems properly configured? I've seen plenty of articles talking about minorities being arrested after some facial recognition software misidentified them. Claiming that the ACLU isn't using the software properly doesn't mean that anyone else is using it properly.
You're talking about something else entirely. The ACLU's argument is "these systems are so bad we can't rely on them" and your argument is "law enforcement may not have them configured correctly".
One of those is factually false.
That being said, every FR system is built differently, and each has its own advantages and considerations. But what I've seen in the news over the past few years is almost always a policy and procedure failure. Somewhere between using a photo of such low quality that it shouldn't have been used at all and the verifying officer failing to recognize that the source photo and the current suspect are different people, something broke down.
I'm actually astonished at how bad the average person is at comparing photos of people. Just look up the conspiracy nonsense these flat earthers go on about regarding the Challenger accident. They're convinced that each person who died is actually still alive and living under a new name. Then they show their "evidence", and I couldn't believe what I was seeing. Sure, these people are similar enough that they could fit a verbal description, but when you actually compare features it's so easy to see they're all different people and can't be the same.
I know it's like that with some cops, because I know some people in emergency services who have been taking FR courses. They told me that so many departments (fire, police, 911 dispatch, forensics, etc.) are being trained on it. And not on the software; it's training in physically identifying people. With this tech and these false arrests, I guess it's come to light that some people, cops or not, lack the fundamental ability to see minor but critical differences in facial anatomy.
Ultimately, whatever a computer system says, a person is making the final decision to arrest these people. This is where the issue lies.
Edit: I guess downvotes mean "I don't like that you're right" here also. I worked in the FR field for almost a decade. I'm familiar with the topic.