submitted 1 year ago* (last edited 1 year ago) by TheOneWithTheHair@lemmy.world to c/news@lemmy.world
[-] halcyoncmdr@lemmy.world 0 points 1 year ago

A person lying on the ground in a crosswalk was likely never something the team considered including in their training data. Those outlier situations are exactly what real-world data is needed for, and the only way to properly train for most of them is to drive in the real world. The real world isn't perfect conditions and crisp lines on fresh asphalt, so while base training under ideal conditions is useful, the system will still miss the exact same situation in a real-world environment with crappy infrastructure.

Not sure what data Cruise collects in real time or how it's used, but I can see the camera pipeline categorizing a person lying in the crosswalk as something like damaged paint lines or small debris that can be ignored. Other sensors like radar and lidar might have categorized the returns as echoes or false readings that could be discarded, again because a person lying in a crosswalk is extremely unlikely. False returns happen all the time with radar and lidar; millions of data points get dismissed as outliers or noise that can be safely ignored, and sometimes that categorization is wrong.
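
A toy sketch of the kind of filtering being described, with entirely made-up class names and thresholds rather than anything from Cruise's actual stack: a detection that lands in an "ignorable" class, or falls below a confidence cutoff, simply never reaches the planner.

```python
# Hypothetical illustration only: labels, threshold, and structure are invented.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "pedestrian", "debris", "lane_marking_damage"
    confidence: float   # classifier confidence in [0.0, 1.0]

IGNORABLE = {"debris", "lane_marking_damage", "sensor_echo"}  # assumed classes
MIN_CONFIDENCE = 0.6  # made-up cutoff

def detections_for_planner(detections):
    """Keep only detections the planner should react to."""
    return [
        d for d in detections
        if d.confidence >= MIN_CONFIDENCE and d.label not in IGNORABLE
    ]

# A prone pedestrian misread as low-confidence "debris" silently disappears:
frame = [Detection("debris", 0.4), Detection("vehicle", 0.95)]
print(detections_for_planner(frame))  # only the vehicle survives the filter
```

The point isn't that any real system is this crude; it's that a misclassification upstream of a filter like this is invisible to everything downstream.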

[-] postmateDumbass@lemmy.world 29 points 1 year ago

Well, if you know you need backup for edge cases, why isn't there a human in the car with controls?

[-] Duranie@lemmy.film 8 points 1 year ago

Not only that, but whether or not it can identify a person as a person, cars shouldn't be driving over objects that are child-sized or larger.

[-] Chozo@kbin.social 4 points 1 year ago* (last edited 1 year ago)

These cars won't even drive over a fire hose laid out across the road (this was literally a test we did regularly when I worked on Waymo's SDCs). They definitely won't drive over a person (intentionally).

Inertia is a factor here; cars can't stop on a dime. By the time the pedestrian was knocked in front of the SDC, there was likely no way left to avoid hitting her, purely due to physics.
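
A rough back-of-the-envelope check of that physics claim; the speed, deceleration, and reaction latency below are assumptions for illustration, not figures from the incident:

```python
# Back-of-the-envelope stopping distance -- all numbers are assumptions.
v = 11.0       # speed in m/s (~25 mph, a typical city limit)
a = 7.0        # hard-braking deceleration in m/s^2 (dry pavement)
t_react = 0.5  # detection + actuation latency in seconds (assumed)

braking_dist = v ** 2 / (2 * a)   # ~8.6 m to shed the speed
reaction_dist = v * t_react       # ~5.5 m travelled before the brakes bite
print(f"total stopping distance ~ {braking_dist + reaction_dist:.1f} m")  # ~14 m
```

If the pedestrian was thrown into the lane only a car length or two ahead, roughly 14 m of stopping distance simply isn't available, whoever or whatever is driving.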

[-] JoBo@feddit.uk 14 points 1 year ago

A person lying on the ground in a crosswalk was likely never something the team considered including in their training data

I didn't bother reading any further than this. The person was on the crosswalk when both cars started moving. Neither car should have been moving while anyone was still on the crosswalk.

[-] MagicShel@programming.dev 5 points 1 year ago

That was the exact moment I called bullshit as well. You'd damn well better plan for people tripping and falling. It happens all the time, and it's generally pretty minor unless it's exacerbated by being run over. This is like saying they didn't train it on people holding canes or using wheelchairs.

[-] JoBo@feddit.uk 5 points 1 year ago

It's not about the ability to recognise someone lying in the road (although they obviously do need to be able to recognise something like that).

She was still walking, upright, on the crosswalk when both cars started moving. No car, driverless or otherwise, should be moving forward just because the lights changed.

[-] mars296@kbin.social 2 points 1 year ago

That's the whole point of their comment. The car did not recognize that anyone was on the crosswalk because it was never trained to look for people lying in the crosswalk.

[-] SheeEttin@lemmy.world 3 points 1 year ago

And that's fine. But if it's unable to recognize an arbitrary object in the road, it's not fit for purpose. The fact that the object was a person just makes it so much worse.

[-] mars296@kbin.social 1 points 1 year ago

Agreed. I'm not defending Cruise at all. They should have humans in the car if they're testing, or at least a remote operator, drone-style, sitting in a room watching a camera feed. I wonder if the car thought there was just a speed bump ahead; some speed bumps are striped similarly to crosswalks. I can see situations where the autopilot can't determine whether something is a speed bump or a genuine obstruction (either a false positive or a false negative).

[-] Chozo@kbin.social 1 points 1 year ago

They are 100% trained on bodies lying prone on the ground.

[-] JoBo@feddit.uk 0 points 1 year ago

She was standing up when the cars started moving.

[-] Nurchu@programming.dev 1 points 1 year ago

I actually work at one of these AV companies. We definitely have training data on adults and children lying down, and I'd be very, very, very surprised if Cruise doesn't, given all the people lying on the sidewalks in SF. In addition, the lidar/camera data on objects in the road is very clear: you can see the dips and potholes in the road surface, and even the raised profile of the painted lines (a toy sketch below makes this concrete). There's no way they weren't tracking the person.

I could see the trajectory prediction for the pedestrian saying the coast was clear. Once the initial crash happens, there likely isn't enough room to stop in time even with maximum braking.
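
To make the lidar-resolution point concrete, here is a toy, made-up sketch (not any company's actual stack, and the point-cloud geometry is invented): even a crude height-above-ground filter separates a prone body, roughly 20-30 cm thick, from paint lines and road-surface noise of a few millimetres to a few centimetres.

```python
# Toy illustration only: a flat ground plane and a hand-made point cloud.
import numpy as np

def obstacle_points(points, ground_z=0.0, height_threshold=0.10):
    """Return lidar points more than `height_threshold` metres above the ground.

    `points` is an (N, 3) array of x, y, z in metres. A flat ground plane at
    `ground_z` is assumed here; real stacks estimate the plane every frame.
    """
    return points[points[:, 2] - ground_z > height_threshold]

rng = np.random.default_rng(0)
# Road-surface returns: +-2 cm of noise, well below the threshold.
road = np.column_stack([rng.uniform(0, 20, 500),
                        rng.uniform(-2, 2, 500),
                        rng.normal(0.0, 0.02, 500)])
# Returns from a prone body ~1.7 m long and up to ~25 cm above the road.
body = np.column_stack([rng.uniform(8.0, 9.7, 50),
                        rng.uniform(-0.3, 0.3, 50),
                        rng.uniform(0.05, 0.25, 50)])
cloud = np.vstack([road, body])

print(len(obstacle_points(cloud)))  # most of the 50 "body" points remain
```

Class labels aside, a cluster of returns this far above the road surface in the drive path is hard to write off as noise, which is the commenter's point.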
