submitted on 04 Oct 2023 (last edited) by TheOneWithTheHair@lemmy.world to c/news@lemmy.world
[-] JoBo@feddit.uk 74 points 1 year ago

Should the Cruise car have not started moving if there was a person still on the crosswalk? This whole sad affair raises many questions.

There are some questions but "should cars start moving while a person is still on the crosswalk?" is surely not one of them.

[-] medgremlin@lemmy.sdf.org 10 points 1 year ago

A different question I have is whether the cars have transponders or other communication devices to automatically call emergency services in case of an accident. I'm assuming not, because they'd probably generate a lot of junk calls, and I doubt the company spent the time to create an algorithm for when to call 911 if they didn't create one for what to do when there's a pedestrian in a crosswalk.

That's one of the big downsides of these driverless cars: if a human driver had accidentally run over the victim, they could get out of the car to assess the situation, call 911, and offer aid. An empty car can only sit there with its hazard lights on and maybe call for emergency services.

[-] Unaware7013@kbin.social 12 points 1 year ago

The auto-driving company should be required to have something like an OnStar operator available any time the vehicle registers an impact or shock above a certain threshold, and any time physical safety measures are required. Local governments should not have to pay for the externalities created by these 'disruptive technology' jerks, especially when there are literal lives on the line.

[-] Chozo@kbin.social 8 points 1 year ago

Cruise has this. I actually applied for the position after my contract doing the same thing with Waymo ended (but was unfortunately ghosted). They've got a team of people who monitor the fleets in real time, mostly just helping a "stuck" car by identifying any objects or street signs that the SDC has been confused by, so that it can proceed with its course. But they also have protocols in place for reporting any collisions as soon as they've happened, as well. Willing to bet that Cruise called emergency services before anybody on the scene even did.

[-] halcyoncmdr@lemmy.world 0 points 1 year ago

A person lying on the ground in a crosswalk was likely never considered by the team to include in their training data. Those outlier situations are exactly what real-world data is needed for, and the only way to properly train for most of them is to drive in the real world. The real world isn't perfect conditions and crisp lines on fresh asphalt, so while base training under ideal conditions is useful, the system will still miss the exact same situation in a real-world environment with crappy infrastructure.

I'm not sure what data Cruise collects in real time or how it's used, but I can see the camera system categorizing a person lying in the crosswalk as something like damaged lane paint or small debris that can be ignored. Other sensors like radar and lidar might have categorized their returns as echoes or false results to be discarded, again because a person lying in a crosswalk is extremely unlikely. False returns happen all the time with radar and lidar; millions of data points get dismissed as outliers or noise, and sometimes that categorization is wrong.
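
Purely as an illustration of how that kind of misclassification could slip through (a toy sketch, not Cruise's actual pipeline; every class name and threshold here is made up):

```python
# Toy perception filter: detections below a confidence floor are discarded as
# sensor noise. A rare class like a prone pedestrian tends to score low, so it
# can get thrown away along with the genuine false returns.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # classifier's best guess for the object
    confidence: float   # 0.0 - 1.0
    distance_m: float   # range from the vehicle

CONFIDENCE_FLOOR = 0.5  # hypothetical threshold; a real system would tune this per class

def filter_detections(detections: list[Detection]) -> list[Detection]:
    """Keep only detections the classifier is 'sure' about."""
    return [d for d in detections if d.confidence >= CONFIDENCE_FLOOR]

frame = [
    Detection("vehicle", 0.97, 22.0),
    Detection("lane_marking_damage", 0.34, 6.0),  # actually a person lying in the crosswalk
    Detection("lidar_echo", 0.12, 6.1),
]

print(filter_detections(frame))  # the prone pedestrian never reaches the planner
```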

[-] postmateDumbass@lemmy.world 29 points 1 year ago

Well, if you know you need backup for edge cases, why isn't there a human in the car with controls?

[-] Duranie@lemmy.film 8 points 1 year ago

Not only that, but whether or not it can identify a person as a person, cars shouldn't be driving over objects that are child-sized or larger.

[-] Chozo@kbin.social 4 points 1 year ago* (last edited 1 year ago)

These cars won't even drive over a fire hose laid out across the road (this was literally a test we did regularly when I worked on Waymo's SDCs). They definitely won't drive over a person (intentionally).

Inertia is a factor here. Cars can't stop on a dime. By the time the pedestrian was knocked in front of the SDC, there was likely no time left to avoid hitting her, just due to physics.
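
Rough numbers to show what I mean (assumed values for speed, braking, and latency, not figures from the actual incident):

```python
# Back-of-the-envelope stopping distance: reaction/processing distance plus
# braking distance d = v^2 / (2a). All inputs are assumptions for illustration.
def stopping_distance_m(speed_mps: float, decel_mps2: float, latency_s: float) -> float:
    return speed_mps * latency_s + speed_mps ** 2 / (2 * decel_mps2)

# ~20 mph (9 m/s), hard braking at 7 m/s^2, 0.5 s to detect and react:
print(round(stopping_distance_m(9.0, 7.0, 0.5), 1))  # ~10.3 m before the car stops
```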

[-] JoBo@feddit.uk 14 points 1 year ago

A person lying on the ground in a crosswalk was likely never considered by the team to include in their training data

I didn't bother reading any further than this. The person was on the crosswalk when both cars started moving. Neither car should have been moving while anyone was still on the crosswalk.

[-] MagicShel@programming.dev 5 points 1 year ago

That was the exact moment I called bullshit as well. You'd damn well better plan for people tripping and falling. It happens all the time, but it's generally pretty minor if not exacerbated by being run over. This is like saying they didn't train it on people holding canes or in wheelchairs.

[-] JoBo@feddit.uk 5 points 1 year ago

It's not about the ability to recognise someone lying in the road (although they obviously do need to be able to recognise something like that).

She was still walking, upright, on the crosswalk when both cars started moving. No car, driverless or otherwise, should be moving forward just because the lights changed.

[-] mars296@kbin.social 2 points 1 year ago

That's the whole point of their comment. The car did not recognize anyone was on the crosswalk because it was never trained to look for people lying in the crosswalk.

[-] SheeEttin@lemmy.world 3 points 1 year ago

And that's fine. But if it's unable to recognize any object in the road, it's not fit for purpose. The fact that the object was a person just makes it so much worse.

[-] mars296@kbin.social 1 points 1 year ago

Agreed. I'm not defending Cruise at all. They should have humans in the car if they are testing, or at least a drone-style driver sitting in a room watching a camera feed. I wonder if the car thought there was just a speed bump ahead; some speed bumps are striped similarly to crosswalks. I can see situations where the autopilot can't determine whether something is a speed bump or a genuine obstruction (either a false positive or a false negative).

[-] Chozo@kbin.social 1 points 1 year ago

They are 100% trained on bodies lying prone on the ground.

[-] JoBo@feddit.uk 0 points 1 year ago

She was standing up when the cars started moving.

[-] Nurchu@programming.dev 1 points 1 year ago

I actually work at one of these AV companies. We definitely have training data on adults and children lying down, and I'd be very, very surprised if Cruise doesn't, given all the people lying on the sidewalks in SF. In addition, the lidar/camera data on objects in the road is very clear: you can see dips and potholes in the road, and even the raised profile of the painted lines. There's no way they weren't tracking the person.

I could see the predictions for the pedestrian saying the coast was clear. Once the initial crash happened, there likely wasn't enough room to stop in time, even with maximum braking.

[-] Chozo@kbin.social 0 points 1 year ago

Without knowing what type of vehicle the first car was, it's hard to say how this played out. If it was a van, truck, or something else that could easily have obstructed Cruise's LIDAR, or if the other vehicle stopped ahead of the crosswalk line, the SDC would've had little to no way of knowing there was anybody in the crosswalk.

Can't know for sure unless Cruise releases the video to the public, which they're unlikely to do until the police do their investigation.

[-] SheeEttin@lemmy.world 6 points 1 year ago

If your view of the crosswalk is obstructed, you don't proceed until it's unobstructed. That's true whether you're working with radar, lidar, or vision. Truck in the way? Pull up as far as you safely can, then look and proceed if clear.
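
In planner terms, that rule is just treating "can't see the crosswalk" the same as "someone is on it." A toy sketch of the idea (not any vendor's actual logic; the function and its inputs are made up):

```python
# Occlusion-aware gate: refuse to enter the crosswalk unless it is both fully
# visible and empty. Creep forward and re-check rather than assume it's clear.
def may_enter_crosswalk(crosswalk_fully_visible: bool,
                        occluded_regions: int,
                        pedestrians_detected: int) -> bool:
    if not crosswalk_fully_visible or occluded_regions > 0:
        return False  # pull up until the view clears, then re-evaluate
    return pedestrians_detected == 0

print(may_enter_crosswalk(False, 1, 0))  # truck blocking the view -> wait
print(may_enter_crosswalk(True, 0, 0))   # clear view, empty crosswalk -> go
```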
