New research shows driverless car software is significantly more accurate with adults and light-skinned people than with children and dark-skinned people.

[-] MyTurtleSwimsUpsideDown@kbin.social -1 points 1 year ago* (last edited 1 year ago)

Wow. That’s all kinds of incorrect.

It’s not a discriminatory bias or even one that can really have anything done about it.

It is absolutely a training-data bias. Whether it is the data the ML model was trained on or the data the programmers were trained on is irrelevant. This is a problem of the computer not recognizing that a human is a human.

It’s purely physics.

It is not. See below:

Is it harder to track smaller objects or larger ones?

No, not if the scale of your hardware is correct. A 3’ tall human may be smaller than a 6’ one, but it is larger than a 10” traffic light lens or a 30” stop sign. The systems do not have quite as much trouble recognizing those smaller objects. This is a problem of the computer not recognizing that the human is a human.
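
To put rough numbers on that (my own back-of-the-envelope assumptions, not figures from the study), a simple pinhole-camera model says the number of pixels an object spans is roughly focal length (in pixels) × object height ÷ distance:

```python
# Back-of-the-envelope pinhole-camera estimate of how many pixel rows an object
# spans. The focal length, distance, and object sizes are illustrative guesses.

def pixels_spanned(object_height_m: float, distance_m: float,
                   focal_length_px: float = 1400.0) -> float:
    """Approximate vertical pixel extent of an object at a given distance."""
    return focal_length_px * object_height_m / distance_m

objects_m = {
    "3 ft child": 0.91,
    "6 ft adult": 1.83,
    "30 in stop sign": 0.76,
    "10 in traffic light lens": 0.25,
}

for name, height in objects_m.items():
    print(f"{name}: ~{pixels_spanned(height, distance_m=30.0):.0f} px at 30 m")
```

With those assumed numbers, even a small child spans several times more pixels than the traffic-light lens the same system is expected to read.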

Is it harder for an optical system to track something darker. In any natural scene.

Also no. If that were the case, we would have problems with collision bias against darker vehicles, or with failing to recognize the black asphalt of the road. Optical systems do not rely on the absolute signal strength of an object; they rely on contrast. A darker skin tone only has low contrast against a background of a similar shade, and that doesn’t even account for clothing, which usually covers most of a person’s body. Again, this is a problem of the computer not recognizing that the human is a human.
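
To illustrate (the luminance numbers below are made up purely for illustration), Weber contrast measures an object against its local background, so a dark object on a light background can stand out just as strongly as a light object on a dark one:

```python
# Weber contrast: (object luminance - background luminance) / background luminance.
# The luminance values are made-up examples, not measurements.

def weber_contrast(object_lum: float, background_lum: float) -> float:
    return (object_lum - background_lum) / background_lum

cases = {
    "dark clothing against bright sky": (20.0, 200.0),
    "dark vehicle against light concrete": (15.0, 120.0),
    "light clothing against dark asphalt": (150.0, 30.0),
    "darker skin against a light storefront": (40.0, 180.0),
}

for name, (obj, bg) in cases.items():
    print(f"{name}: contrast = {weber_contrast(obj, bg):+.2f}")

# Detection cares about the magnitude of that contrast against the local
# background, not the absolute brightness of the object itself.
```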

smaller and darker individuals have less signal. Less signal means lower probability of detection,

No, they have different signals. That signal needs to be compared to the background to determine whether it exists and where it is, and then compared to the dataset to determine what it is. This is still a problem of the computer not recognizing that the human is a human.
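
As a toy sketch of those two comparisons (the array sizes, threshold, and classifier stub are all invented for illustration): step one separates a candidate from its background to find where something is; step two matches it against what the model was trained on to decide what it is, and that second step is where training bias bites.

```python
import numpy as np

# Toy two-stage sketch: localize by contrast against the background, then classify.
# All values are placeholders invented for illustration.

rng = np.random.default_rng(0)
background = np.full((100, 100), 120.0) + rng.normal(0.0, 2.0, (100, 100))
scene = background.copy()
scene[40:70, 45:55] = 35.0                      # a darker figure in the scene

# Stage 1: "is there something here, and where?" -- contrast vs. background
contrast = np.abs(scene - background) / background
candidate_mask = contrast > 0.2

# Stage 2: "what is it?" -- stand-in for a trained classifier; this is the step
# where biased training data can fail to recognize a human as a human.
def classify(patch: np.ndarray) -> str:
    return "pedestrian" if patch.mean() < 100.0 else "unknown"

ys, xs = np.nonzero(candidate_mask)
patch = scene[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
print("candidate found:", bool(candidate_mask.any()))
print("classified as:", classify(patch))
```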

It’s the same reason a stealth bomber is harder to track than a passenger plane. Less signal.

Close, but not quite.

  1. In this case the “less signal” only works because it is compared to a low-signal background, creating a low-contrast image. It is more like camouflage than invisibility. Radar uses a single source of “illumination” against a mostly empty backdrop, so the background is “dark”, like looking up at the night sky with a flashlight.
  2. The “less signal” is not because the plane is optically dark. It has a special coating that absorbs some of the radar illumination and a special shape that scatters some of it, coming from that single source, away from the single-point sensor. Humans of any skin tone are not specially designed to absorb and scatter optical light from any particular type of light source away from any particular sensor. Even at night, a vehicle should have a minimum of 2 headlights as sources of optical illumination (as well as streetlights, other vehicles, buildings, signs, and other light pollution) and multiple sensors. Furthermore, the system should be designed to demand manual control as it approaches insufficient illumination to operate (see the sketch after this list).
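
And on that last point, a minimal sketch of what such a fallback guard could look like; the lux floor, margin, and function name are invented for illustration, not any vendor’s actual interface:

```python
# Illustrative guard: request a handover to the driver before scene illumination
# drops below the minimum the optical stack needs. All numbers are assumptions.

MIN_OPERABLE_LUX = 5.0    # assumed floor for reliable optical detection
HANDOVER_MARGIN = 2.0     # start handing over well before the floor is reached

def should_request_manual_control(measured_lux: float) -> bool:
    return measured_lux < MIN_OPERABLE_LUX * HANDOVER_MARGIN

for lux in (200.0, 15.0, 8.0, 3.0):
    action = "request driver takeover" if should_request_manual_control(lux) else "continue"
    print(f"{lux:>6.1f} lux -> {action}")
```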

This is a problem of the computer not recognizing that the human is a human.
