123 points · submitted 1 year ago by 0x815@feddit.de to c/news@beehaw.org
Lowbird@beehaw.org 11 points 1 year ago

Also, it's the type of thing that makes me very worried that most of the algorithms used in police facial recognition software, recidivism prediction software, and the like are proprietary black boxes.

There are guaranteed to be biases in those tools, whether in the algorithms themselves or in the unknown datasets they're trained on, and neither police nor journalists can see the software's inner workings to learn what those biases are, to counterbalance them, or to recognize when the software is so biased as to be useless.
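Even without access to the internals, an outside auditor can sometimes detect this kind of bias from the outputs alone. Below is a minimal sketch of that idea, assuming a labeled test set with demographic annotations; the function names, the 1.5x threshold, and the sample data are all hypothetical, not anything from a real vendor's tool.

```python
# Minimal sketch of a black-box bias audit: without seeing the model's code
# or training data, compare its false positive rates across demographic
# groups using only labeled test cases and the model's outputs.
# All names, data, and thresholds here are hypothetical placeholders.

from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of (group, predicted_match, is_true_match) tuples.
    Returns {group: false positive rate among true non-matches}."""
    false_positives = defaultdict(int)
    true_non_matches = defaultdict(int)
    for group, predicted, actual in records:
        if not actual:  # only true non-matches can produce false positives
            true_non_matches[group] += 1
            if predicted:
                false_positives[group] += 1
    return {
        g: false_positives[g] / true_non_matches[g]
        for g in true_non_matches
        if true_non_matches[g] > 0
    }

# Hypothetical audit data: (demographic group, model said "match", ground truth)
test_records = [
    ("group_a", True, False), ("group_a", False, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, False),
]

rates = false_positive_rates(test_records)
for group, fpr in rates.items():
    print(f"{group}: false positive rate = {fpr:.2f}")

# A large gap between groups (one group misidentified far more often than
# another) is evidence of bias, even though the model stays a black box.
worst, best = max(rates.values()), min(rates.values())
if best > 0 and worst / best > 1.5:
    print("Warning: false positive rate disparity exceeds 1.5x across groups")
elif best == 0 and worst > 0:
    print("Warning: false positives occur only in some groups")
```

False positive rate is the metric that matters most here, since a false facial-recognition "match" is what leads to a wrongful stop or arrest; but this kind of audit is only possible if someone with labeled data is allowed to run the system in the first place, which is exactly what proprietary black boxes tend to prevent.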
