
New research shows driverless-car software is significantly more accurate with adults and light-skinned people than with children and dark-skinned people.

top 50 comments
[-] TrismegistusMx@lemmy.world 29 points 1 year ago* (last edited 1 year ago)

Built by Republican engineers.

I've seen this episode of Better Off Ted.

[-] cloudpunk@sh.itjust.works 22 points 1 year ago

Veronica: The company's position is that it's actually the opposite of racist, because it's not targeting black people. It's just ignoring them. They insist the worst people can call it is "indifferent."

Ted: Well, they know it has to be fixed, right? Please... at least say they know that.

Veronica: Of course they do, and they're working on it. In the meantime they'd like everyone to celebrate the fact that it sees Hispanics, Asians, Pacific Islanders, and Jews.

[-] SpeedLimit55@lemmy.world 12 points 1 year ago

That show ended too soon.

[-] SeaJ@lemm.ee 3 points 1 year ago

One reason why diversity in the workplace is necessary. Most of the country is not 40-year-old white men, so products need to be made with a more diverse crowd in mind.

[-] ChonkyOwlbear@lemmy.world 25 points 1 year ago

Humans have the same problem, so it's not surprising.

[-] Cethin@lemmy.zip 7 points 1 year ago

Yeah, this is a dumb article. There's something to be said about biased training data, but my guess is that it's simply harder to see people who are smaller or who have darker skin. It's less about the training data and more the same limitation our eyes have. There's also something to be said about Tesla using only regular cameras instead of lidar, though I don't think lidar performance would vary with skin tone. Smaller things will always be harder to see.

[-] electrogamerman@lemmy.world 2 points 1 year ago

I can spot kids and dark-skinned people pretty well

[-] Endomlik@reddthat.com 20 points 1 year ago

Seems this will always be the case. Small objects are harder to detect than large ones, and high-contrast objects are easier to detect than low-contrast ones. Even if detection gets 1000x better, both will still be true. Do you introduce artificial error to make things fair?

Repeating the same comment from a crosspost.
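To make that concrete, here's a toy sketch (not any real detector, and all the luminance/size numbers are made up) of why a small, low-contrast target naturally scores lower for a detector that keys on how much a region stands out from its background:

```python
# Toy illustration only: many vision pipelines score a candidate region
# partly by its contrast against the background and its size in pixels.
# All values below are hypothetical, chosen just to show the effect.

def weber_contrast(object_luminance: float, background_luminance: float) -> float:
    """Weber contrast: |L_object - L_background| / L_background."""
    return abs(object_luminance - background_luminance) / background_luminance

def toy_detection_score(contrast: float, pixel_area: int) -> float:
    """Made-up score that grows with contrast and with apparent size."""
    return contrast * (pixel_area ** 0.5)

# Same background (road at 100 arbitrary luminance units) for both targets.
tall_high_contrast = toy_detection_score(weber_contrast(180, 100), 40_000)
small_low_contrast = toy_detection_score(weber_contrast(110, 100), 10_000)

print(tall_high_contrast > small_low_contrast)  # prints True
```

No amount of uniform improvement to the scoring function changes the ordering: the smaller, lower-contrast target is always the harder one, which is the point above.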

[-] kibiz0r@midwest.social 14 points 1 year ago

All the more reason to take this seriously and not disregard it as an implementation detail.

When we, as a society, ask: Are autonomous vehicles safe enough yet?

That’s not the whole question.

…safe enough for whom?

[-] Mac@mander.xyz 7 points 1 year ago

Also what is the safety target? Humans are extremely unsafe. Are we looking for any improvement or are we looking for perfection?

[-] kibiz0r@midwest.social 3 points 1 year ago* (last edited 1 year ago)

This is why it’s as much a question of philosophy as it is of engineering.

Because there are things we care about besides quantitative measures.

If you replace 100 pedestrian deaths due to drunk drivers with 99 pedestrian deaths due to unexplainable self-driving malfunctions… Is that, unambiguously, an improvement?

I don’t know. In the aggregate, I guess I would have to say yes..?

But when I imagine being that person in that moment, trying to make sense of the sudden loss of a loved one and having no explanation other than watershed segmentation and k-means clustering… I start to feel some existential vertigo.

I worry that we’re sleepwalking into treating rationalist utilitarianism as the empirically correct moral model — because that’s the future that Silicon Valley is building, almost as if it’s inevitable.

And it makes me wonder, like… How many of us are actually thinking it through and deliberately agreeing with them? Or are we all just boiled frogs here?

[-] Duamerthrax@lemmy.world 5 points 1 year ago

Can everyone who feels the need to jog at twilight hours please wear bright colors? I get anxiety driving to my suburban friends'.

[-] Sconrad122@lemmy.world 7 points 1 year ago

They should. But also, good. You should absolutely feel anxiety operating a multi-ton piece of heavy machinery. Even if everybody was super diligent about making themselves visible, there would still be the off cases. Someone's boss held them late and they missed the last bus so now they need to walk home in the dark when they dressed expecting to ride home in the day. Someone is down on their luck and needs to get to the nearest homeless resource and doesn't have access to bright clothes. Drivers should never feel comfortable that obstacles will always be reflective and bright. Our transportation infrastructure should not be built to lull them into that false sense of comfort.

[-] huginn@feddit.it 14 points 1 year ago

I wonder what the baseline is for the average driver spotting those same people? I expect it's higher than the learning algo's, but by how much?

[-] luckyhunter@lemmy.world 13 points 1 year ago
[-] CluckN@lemmy.world 3 points 1 year ago

They touch on this topic in Cars 2.

[-] luckyhunter@lemmy.world 1 points 1 year ago

Italian fork lift pit crew FTW!

[-] lefixxx@lemmy.world 10 points 1 year ago
[-] Astrealix@lemmy.world 10 points 1 year ago
[-] GillyGumbo@lemmy.world 7 points 1 year ago* (last edited 1 year ago)

Okay, but being better at detecting X doesn't mean you can't detect Y well enough. I'd say the physical attributes of children and dark-skinned people make them harder for humans to see as well, under many conditions. But that doesn't require a news article.

[-] eltimablo@kbin.social 3 points 1 year ago

But that doesn’t require a news article.

Most things I read on the Fediverse don't.

[-] FarceMultiplier@lemmy.ca 6 points 1 year ago

Elon, probably: "Go be young and black somewhere else"

[-] FlyingSquid@lemmy.world 6 points 1 year ago

Those types of people are the ones who Elon Musk calls "acceptable losses."

[-] dreadedsemi@lemmy.world 6 points 1 year ago

What about hairy people? Hope cars won't think I'm a raccoon and decide it's grill time!

[-] worsedoughnut@lemdro.id 3 points 1 year ago* (last edited 1 year ago)

Unfortunate but not shocking to be honest.

Only recently did smartphone cameras get better at detecting darker-skinned faces in software, and that was something they were probably working toward for a decent while. Not all that surprising that other camera tech would have to play catch-up in that regard as well.

this post was submitted on 31 Aug 2023
288 points (93.9% liked)

News

23275 readers
1384 users here now

Welcome to the News community!

Rules:

1. Be civil


Attack the argument, not the person. No racism/sexism/bigotry. Good faith argumentation only. This includes accusing another user of being a bot or paid actor. Trolling is uncivil and is grounds for removal and/or a community ban. Do not respond to rule-breaking content; report it and move on.


2. All posts should contain a source (url) that is as reliable and unbiased as possible and must only contain one link.


Obvious right or left wing sources will be removed at the mods' discretion. We have an actively updated blocklist, which you can see here: https://lemmy.world/post/2246130 if you feel like any website is missing, contact the mods. Supporting links can be added in comments or posted separately, but not to the post body.


3. No bots, spam or self-promotion.


Only approved bots, which follow the guidelines for bots set by the instance, are allowed.


4. Post titles should be the same as the article used as source.


Posts whose titles don't match the source won't be removed, but the autoMod will notify you, and if your title misrepresents the original article, the post will be deleted. If the site changed its headline, the bot might still contact you; just ignore it, we won't delete your post.


5. Only recent news is allowed.


Posts must be news from the most recent 30 days.


6. All posts must be news articles.


No opinion pieces, listicles, editorials, or celebrity gossip are allowed. All posts will be judged on a case-by-case basis.


7. No duplicate posts.


If a source you used was already posted by someone else, the autoMod will leave a message. Please remove your post if the autoMod is correct. If the post that matches your post is very old, we refer you to rule 5.


8. Misinformation is prohibited.


Misinformation / propaganda is strictly prohibited. Any comment or post containing or linking to misinformation will be removed. If you feel that your post has been removed in error, credible sources must be provided.


9. No link shorteners.


The autoMod will contact you if a link shortener is detected; please delete your post if it is right.


10. Don't copy the entire article into your post body


For copyright reasons, you are not allowed to copy an entire article into your post body. This is an instance-wide rule that is strictly enforced in this community.

founded 1 year ago
MODERATORS