[-] SuperSleuth@lemm.ee 13 points 1 year ago

Should a self-driving car face more rigorous tests than actual human drivers? Honest question.

[-] optissima@lemmy.world 28 points 1 year ago
[-] stopthatgirl7@kbin.social 8 points 1 year ago

Yes, because when a person is driving, you usually know exactly who is legally to blame in an accident. With self-driving, if the car hits and kills someone, who do you charge? There’s no single person you can hold responsible when something goes wrong, the way you can with a human driver.

[-] IphtashuFitz@lemmy.world 15 points 1 year ago

Yes. A human brain can handle edge cases it’s never encountered before. Can a self driving car?

  • Ever stop at a red light only to have a police officer wave you through?

  • Ever encounter a car driving the wrong way down a one way street?

  • Ever come across a flooded-out stretch of road? (If the road has no lines and the water is still, it can look very deceptive.)

These are just a handful of situations I’ve encountered over the past few years. I’m sure plenty of other drivers can provide other good examples. I’d want to know how a self-driving car would handle itself in situations like these.

[-] merc@sh.itjust.works 1 points 1 year ago

Those are pretty basic conditions that I hope are already in the training data.

What about a wildfire evacuation? Police might have people driving on the wrong side of the highway to make use of all the lanes. Smoke might be obscuring everything. A human driver would know not to pay attention to any of the road signs in that situation without ever having been trained on it, but would a self-driving car?

Or, how about any situation where a police officer has to have the driver roll down the window to give instructions for dealing with something unusual, like a chemical spill or a landslide.

Or, what about highway signs that have been shot by a shotgun so that they're hard to read? Or, what about novelty highway signs that a business might put up as a joke?

Self-driving cars definitely need to be tested against a much bigger range of situations than a human driver. Much as we might be baffled by their lack of common sense, the common sense of an average 16-year-old is still off the charts compared to an AI. Having said that, I know how bad many drivers are, and I wouldn't be surprised if the competent self-driving car organizations (Cruise, Waymo, etc.) are already better than the average driver in 99.9% of common scenarios.

[-] TopShelfVanilla@sh.itjust.works 0 points 1 year ago

How will the bot car handle itself out in the country? Dirt roads? Deer? Roadblock checkpoints full of bored, mean-spirited cops?

[-] NeoNachtwaechter@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

How will the bot car handle itself out in the country? Dirt roads?

They don't go there. They have their limits. Simple as that.

But when the police have ordered them there (for example, because the good road must be cleared for an emergency), that's when the trouble starts... now imagine not just one or two of them, but hundreds.

[-] FoxBJK@midwest.social 15 points 1 year ago

Human drivers should be facing more rigorous testing regardless. It’s horrifically easy to get a license… and then they never test you again for the rest of your life. That’s just insane when you think about it. My test was in 2002. Feels like I should have to retake it at some point.

[-] TenderfootGungi@lemmy.world 5 points 1 year ago

And take licenses away for bad driving. But we don’t, because our entire transportation infrastructure, outside of a few cities (namely NY), is built around everyone driving a car.

[-] snooggums@kbin.social 8 points 1 year ago

Yes, because each person must learn on their own and has only limited experience relative to the general public as a whole.

Self-driving cars can 'learn' from the experience of every other self-driving car, and they don't get tired, forget, or anything like that. While they shouldn't be held to perfection, they should absolutely be held to a higher standard than a human.

[-] NeoNachtwaechter@lemmy.world 5 points 1 year ago

Should a self-driving car face more rigorous tests than actual human drivers? Honest question

First: none of these automated cars would pass a German driver's license test. Not even close.

Second: of course you cannot compare tests for humans with tests for machines.

[-] merc@sh.itjust.works 1 points 1 year ago

What are the things you think would cause a self-driving car to fail the German test?

[-] NeoNachtwaechter@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

Many. But the most obvious one currently: they drive too slowly.

And maybe the funniest: they are unable to turn their heads (to prove to the examiner that they are looking where they are required to look).

[-] merc@sh.itjust.works 2 points 1 year ago

Going too slow where, in the city? They're following traffic and/or speed limits. They also go on highways, but again, there are speed limits. I'm sure they could have a version for Germany where the max speed was adjusted to be appropriate for the Autobahn.

As for turning their heads, does Germany make no exceptions for people with disabilities? I'm sure that they could implement a version of "show the instructor you're paying attention to the right thing".

[-] NeoNachtwaechter@lemmy.world 0 points 1 year ago

Going too slow where, in the city?

Everywhere.

exceptions for people with disabilities?

Well, not the kind of exception where the requirements are waived and everything is made easier just for you.

I don't know this specific case, but in general, a disabled person must have some kind of aid that fully compensates for the disability.

Think of eyeglasses: your eyes are bad, so you are required to wear glasses (or contacts). Note that the burden is on you. You get an extra note written into your license saying that you must always wear them when driving, and they must fully correct your vision; otherwise you are not allowed to drive.

[-] nxfsi@lemmy.world -3 points 1 year ago

Only Tesla self-driving cars need to face more rigorous tests. Other brands are fine as they are, because they have LiDAR.

[-] IphtashuFitz@lemmy.world 8 points 1 year ago

LiDAR isn’t some sort of magic eye. A self-driving system is only as good as the software that takes the inputs from cameras, LiDAR, etc., processes them, and ensures safe operation of the car.
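That point can be made concrete with a toy sketch (everything below is hypothetical and illustrative, not any vendor's actual pipeline): even a perfect LiDAR range reading still passes through fusion and decision code, and a bad weight or threshold in either step defeats the sensor.

```python
# Toy sensor-fusion sketch. All function names, weights, and thresholds
# are made up for illustration; a real stack is vastly more complex.

def fuse_obstacle_distance(camera_m, lidar_m, lidar_weight=0.7):
    """Blend camera and LiDAR distance estimates for one obstacle (metres)."""
    if lidar_m is None:  # sensor dropout: fall back to the camera alone
        return camera_m
    return lidar_weight * lidar_m + (1 - lidar_weight) * camera_m

def should_brake(distance_m, speed_mps, reaction_s=1.5, max_decel_mps2=6.0):
    """Brake if the stopping distance at current speed exceeds the gap."""
    stopping_m = speed_mps * reaction_s + speed_mps**2 / (2 * max_decel_mps2)
    return stopping_m >= distance_m
```

Note that the braking decision never sees the raw LiDAR return at all, only the fused number; if `lidar_weight` or `max_decel_mps2` is wrong, the best sensor in the world doesn't help.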

[-] nxfsi@lemmy.world 2 points 1 year ago

Finally someone who actually uses critical thinking instead of being an anti-Elon bandwagoner.

[-] skymtf@lemmy.blahaj.zone 2 points 1 year ago

I feel like all of them do. Have you seen Waymo nearly getting black people killed because it didn't stop for a cop? And it can't recognize construction zones.

[-] sky@codesink.io -2 points 1 year ago

Five LiDAR sensors haven't stopped Cruise from running into a bus, multiple cars, and a fire truck. Maybe self-driving is a myth?

Maybe we should just build buses and trains and pay people good salaries to operate them??

this post was submitted on 03 Sep 2023
209 points (89.7% liked)
