[-] blindbunny@lemmy.ml 152 points 1 year ago

This is what happens when a company has no diversity. Most companies dogfood their own products. Reminds me of Google's gorilla situation...

[-] Hubi@lemmy.world 66 points 1 year ago* (last edited 1 year ago)

This happens even on newer Toyotas, so it's not exactly company-specific. The issue is the biased training data used for the face recognition system.
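The skewed-training-data failure mode described here can be sketched with a toy example. Everything below is hypothetical (the feature, group baselines, and threshold model are invented for illustration, not Toyota's actual system): a single detection threshold fit to data dominated by one group works well for that group and fails for the underrepresented one.

```python
import random

random.seed(0)

# Hypothetical "eye aperture" feature for two demographic groups.
# Open eyes score higher than closed, but the baseline differs by group.
def sample(group, state, n):
    base = {"A": 0.60, "B": 0.45}[group]
    shift = 0.15 if state == "open" else 0.0
    return [base + shift + random.gauss(0, 0.03) for _ in range(n)]

# Biased sampling: 95% of training examples come from group A.
train_open = sample("A", "open", 95) + sample("B", "open", 5)
train_closed = sample("A", "closed", 95) + sample("B", "closed", 5)

# Fit one global decision threshold to the (skewed) training data:
# the midpoint between the mean "open" and mean "closed" scores.
threshold = (sum(train_open) / len(train_open) +
             sum(train_closed) / len(train_closed)) / 2

def predict(x):
    return "open" if x >= threshold else "closed"

def accuracy(group):
    # How often open eyes are correctly detected as open, per group.
    xs = sample(group, "open", 200)
    return sum(predict(x) == "open" for x in xs) / len(xs)

print(f"threshold = {threshold:.2f}")
print(f"group A open-eye detection rate: {accuracy('A'):.0%}")
print(f"group B open-eye detection rate: {accuracy('B'):.0%}")
```

Because group B's "open" scores sit below the threshold the majority group dictated, the system flags group B drivers as having closed eyes almost constantly, while group A is detected nearly perfectly. Per-group evaluation in testing would have surfaced this immediately.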

[-] blindbunny@lemmy.ml 10 points 1 year ago* (last edited 1 year ago)

This seems more like an excuse. These companies aren't all using the same training data.

They literally never tested this on an Asian person before selling the vehicle...

[-] Fedizen@lemmy.world 73 points 1 year ago* (last edited 1 year ago)

Toyota is a Japanese manufacturer. Likely they localize the feature and the localized version has the problem. It's completely possible they all contract the same software vendors in the US for certification reasons, resulting in similar problems.

[-] systemglitch@lemmy.world 12 points 1 year ago* (last edited 1 year ago)

Your claim is a Japanese company never tested on Asian people? Would you place a bet on those odds?

[-] zaph@sh.itjust.works 9 points 1 year ago

Toyota in the US is more American than most American car companies. The tech being different isn't that big of a stretch.

[-] Kaboom@reddthat.com 8 points 1 year ago

Toyota is literally Japanese

[-] brbposting@sh.itjust.works 23 points 1 year ago

Nikon is too

[-] magnetosphere@fedia.io 12 points 1 year ago

That’s what I was thinking. How did this slip by? If I recall correctly, Toyota is better than average when it comes to quality control. This is Boeing-level laziness/incompetence.

[-] rustydrd@sh.itjust.works 122 points 1 year ago

Classic case of OWPITTS (only white people in the training set).

[-] Death_Equity@lemmy.world 23 points 1 year ago

I am guessing the car wasn't made by Subaru or Nissan and is from Ford, GMC, Tesla, BMW, or Mercedes.

[-] Cqrd@lemmy.dbzer0.com 38 points 1 year ago

My Subaru has a driver attention feature that's constantly going off if I sit up straight because I'm too tall 🫠

[-] K4mpfie@feddit.de 13 points 1 year ago

Classic case of OSPITTS (only small people in the training set).

[-] ClockworkN@lemmy.world 57 points 1 year ago

There's an episode of American Auto where they make a self-driving car that can't see black people. It's a good show. Check it out.

[-] ech@lemm.ee 21 points 1 year ago

Sounds like a similar episode of Better Off Ted. Also a great show that only got 2 seasons.

[-] shadowbroker@lemm.ee 46 points 1 year ago

What does the system do when you wear sunglasses?

[-] UnderpantsWeevil@lemmy.world 15 points 1 year ago

Automatically voids the warranty

[-] mavu@discuss.tchncs.de 43 points 1 year ago

My car keeps screaming at me to keep my hands on the wheel, WHILE I'M FUCKING HOLDING IT.

[-] Thcdenton@lemmy.world 20 points 1 year ago

Dude. I'm never buying a new car. That shit is insane.

[-] theangryseal@lemmy.world 10 points 1 year ago

You black with a black steering wheel or white with a white one?

[-] Artyom@lemm.ee 9 points 1 year ago

My car's lane assist deactivates if the road is too straight because I haven't moved the steering wheel in too long. The only way to get it back is to swerve a bit.

[-] DogWater@lemmy.world 40 points 1 year ago

Lol just like the old Xbox Kinect failing miserably at seeing dark-skinned black people correctly, or at all

[-] Thcdenton@lemmy.world 31 points 1 year ago

Sorry asians. You know what you have to do.

[-] sudo42@lemmy.world 30 points 1 year ago

There’s a Better Off Ted episode that hilariously addressed this issue.

[-] Spur4383@lemmy.world 16 points 1 year ago

That show should have more seasons.

[-] knobbysideup@sh.itjust.works 26 points 1 year ago* (last edited 1 year ago)

My Ford transit kept telling me to pull over to rest. It was a windy day.

[-] shalafi@lemmy.world 26 points 1 year ago

My wife's Asian and has only been driving 3 years. LMFAO, she would be a shaking crying mess if the car kept yelling at her to pay attention.

[-] Death_Equity@lemmy.world 9 points 1 year ago

Car manufacturer tried to make a safer and more attentive driver through monitoring and warning systems, accidentally causing crippling anxiety instead.

[-] DarkThoughts@fedia.io 21 points 1 year ago

I'm sure it is possible to disable this feature?

[-] ASeriesOfPoorChoices@lemmy.world 40 points 1 year ago

company vehicle. might not be allowed to for insurance reasons. 🤷‍♂️

[-] magikmw@lemm.ee 15 points 1 year ago

The fun part is, the company will get a higher premium after they go through data from that car and its "sleepy" driver.

[-] magnetosphere@fedia.io 13 points 1 year ago

Sue the company for racism and creating a hostile work environment.

I’m not sure how serious I am.

[-] HawlSera@lemm.ee 17 points 1 year ago

You'd think they'd have learned from all the cameras that can't see black people....

This is racist as shit.

It's actually probably not racist as shit.

I'm white with larger eyes. My car tells me the same thing. Constantly.

Yes I understand that facial recognition software is usually racist as shit, but this particular situation may just be shitty software rather than racist shitty software.

[-] JayDee@lemmy.ml 16 points 1 year ago

I remember this episode from Better Off Ted.

[-] smb@lemmy.ml 14 points 1 year ago

I'd try to stick some googly eyes on a headband to wear when driving. If it doesn't help, it's at least good for a selfie.

[-] Empricorn@feddit.nl 14 points 1 year ago

Can people see why "DEI" programs are genuinely good things yet?

[-] fishbone@lemmy.world 7 points 1 year ago

Just in case anyone doesn't know the acronym: DEI is Diversity, equity, and inclusion.

https://en.wikipedia.org/wiki/Diversity,_equity,_and_inclusion

[-] Duamerthrax@lemmy.world 6 points 1 year ago

I can see small teams not having the personnel to account for every possibility, but this should have gotten picked up in testing and not made it to production. There was an automatic soap dispenser that couldn't see dark skin, but that didn't make it to full-scale production.

[-] toiletobserver@lemmy.world 9 points 1 year ago

My white guy sucks

[-] RampantParanoia2365@lemmy.world 6 points 1 year ago

There has to be a way to calibrate it, no? Something like this can't be designed without setting a baseline, and surely there's a ton of variance.

this post was submitted on 02 Jun 2024
1084 points (98.7% liked)

Facepalm
