In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car not only crashing through the styrofoam wall but also hitting a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

[-] FuglyDuck@lemmy.world 11 points 2 months ago

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before making impact.

This has been known.

They do it so they can evade liability for the crash.

[-] bazzzzzzz@lemm.ee 2 points 2 months ago

Not sure how that helps in evading liability.

Every Tesla driver would need superhuman reaction speeds to respond within 17 frames, about 680 ms at 25 fps (I didn't check the recording framerate, but 25 fps is the slowest reasonable rate), less than a second.
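The framerate arithmetic above can be sanity-checked (25 fps is the commenter's stated assumption, not a confirmed recording rate):

```python
# Convert a frame count to a reaction window under an assumed framerate.
frames = 17
fps = 25  # assumption from the comment, not verified against the video
reaction_window_s = frames / fps
print(f"{reaction_window_s * 1000:.0f} ms")  # prints "680 ms"
```

For comparison, commonly cited driver brake-reaction times are on the order of 1 to 2.5 seconds, well above this window.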

[-] FuglyDuck@lemmy.world 2 points 2 months ago

It’s not likely to work, but swapping to human control after the system has determined a crash is going to happen isn’t accidental.

Anything they can do to mire the proceedings they will do. It’s like how corporations file stupid junk motions to force plaintiffs to give up.

[-] fibojoly@sh.itjust.works 1 points 2 months ago* (last edited 2 months ago)

That makes so little sense... It detects it's about to crash, then gives up and lets you sort it out?
That's like the opposite of my Audi, which does detect that I'm about to hit something and either gives me a warning or just actively hits the brakes if I don't have time to handle it.
If this is true, it's so fucking evil it's kinda amazing it could have reached anywhere near prod.

[-] Red_October@lemmy.world 3 points 2 months ago

The point is that they can say "Autopilot wasn't active during the crash." They can leave out that Autopilot was active right up until the moment before, or that Autopilot directly contributed to the crash. They're leaning purely on the technical truth that it wasn't on during the crash, whether in a courtroom defense or in their own next published set of data: "Autopilot was not active during any recorded Tesla crashes."

[-] FuglyDuck@lemmy.world 1 points 2 months ago* (last edited 2 months ago)

Even your Audi is going to dump to human control if it can't figure out what the appropriate response is. Granted, your Audi is probably smart enough to be like "yeah, don't hit the fucking wall," but eh... it was put together by people who actually know what they're doing, and who care about safety.

Tesla isn't doing this for safety or because it's the best response. The cars are doing this because they don't want to pay out for wrongful death lawsuits.

If this is true, this is so fucking evil it’s kinda amazing it could have reached anywhere near prod.

It's musk. he's fucking vile, and this isn't even close to the worst thing he's doing. or has done.

[-] NotMyOldRedditName@lemmy.world 1 points 2 months ago* (last edited 2 months ago)

Any crash within 10s of a disengagement counts as the system being on, so you can't just do this.

Edit: added the time unit.

Edit2: it's actually 30s not 10s. See below.

[-] FuglyDuck@lemmy.world 0 points 2 months ago

Where are you seeing that?

There’s nothing I’m seeing as a matter of law or regulation.

In any case, liability (especially civil liability) is an absolute bitch. It's incredibly messy and will likely never be so cut and dried.

[-] NotMyOldRedditName@lemmy.world 2 points 2 months ago* (last edited 2 months ago)

Well, it's not that every crash involving a Level 2 system counts as caused by it, but that they'll investigate it.

So you can't hide the crash by disengaging it just before.

Looks like it's actually 30 seconds, not 10, or maybe it was 10 once upon a time and they changed it to 30?

The General Order requires that reporting entities file incident reports for crashes involving ADS-equipped vehicles that occur on publicly accessible roads in the United States and its territories. Crashes involving an ADS-equipped vehicle are reportable if the ADS was in use at any time within 30 seconds of the crash and the crash resulted in property damage or injury

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
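The quoted reporting test is simple enough to sketch as a predicate (hypothetical field names; the real Standing General Order reports are filed on NHTSA forms, and this ignores the Order's other conditions like occurring on a publicly accessible US road):

```python
REPORTING_WINDOW_S = 30.0  # per the NHTSA Standing General Order quote above

def is_reportable(ads_last_active_s_before_crash: float,
                  property_damage_or_injury: bool) -> bool:
    """A crash is reportable if the ADS/Level 2 system was in use at any
    time within 30 seconds of the crash AND it caused damage or injury."""
    return (ads_last_active_s_before_crash <= REPORTING_WINDOW_S
            and property_damage_or_injury)

# Disengaging Autopilot a fraction of a second before impact
# does not escape the reporting requirement:
print(is_reportable(0.68, True))   # prints "True"
print(is_reportable(45.0, True))   # prints "False" (outside the window)
```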

[-] FuglyDuck@lemmy.world 1 points 2 months ago

Thanks for that.

The thing is, though, the NHTSA generally doesn't make a determination on criminal or civil liability. They'll make a report about what happened, keep it to the facts, and let the courts sort out who's at fault. They might not even actually investigate a crash unless it comes to that. It's just saying "when your car crashes, you need to tell us about it," and they kinda assume companies comply.

Which Tesla doesn't want to comply with, and is one of the reasons Musk/DOGE is going after the agency.

[-] NotMyOldRedditName@lemmy.world 1 points 2 months ago* (last edited 2 months ago)

I knew they wouldn't necessarily investigate it, that's always their discretion, but I had no idea there was no actual bite to the rule if they didn't comply. That's stupid.

[-] Simulation6@sopuli.xyz 0 points 2 months ago

If the disengage-to-avoid-legal-consequences feature does exist, then you would think there would be some false-positive incidents where it turns off for no apparent reason. I found some with a search, which are attributed to bad software. Owners are discussing new patches fixing some problems and introducing new ones. None of the incidents caused an accident, so maybe those owners never hit the malicious code.

[-] FuglyDuck@lemmy.world 1 points 2 months ago

If it randomly turns off for no apparent reason, people are going to be like "oh, that's weird" and leave it at that. Tesla certainly isn't going to admit that their code is malicious like that. At least not until the FBI is digging through their memos to show it was. And maybe not even then.

[-] RampantParanoia2365@lemmy.world 0 points 2 months ago

If it knows it's about to crash, then why not just brake?

[-] GoodLuckToFriends@lemmy.today 1 points 2 months ago

Because even braking can't avoid the crash. Unavoidable crash means bad juju if the 'self driving' car image is meant to stick around.

[-] NotMyOldRedditName@lemmy.world 1 points 2 months ago* (last edited 2 months ago)

AEB was originally designed not to prevent a crash, but to slow the car when an unavoidable crash was detected.

It's since gotten better and can also prevent crashes now, but slowing the speed of the crash was the original important piece. It's a lot easier to predict an unavoidable crash than to detect a potential crash and stop in time.

Insurance companies offer a discount for having any type of AEB as even just slowing will reduce damages and their cost out of pocket.

Not all AEB systems are created equal though.

Maybe disengaging AP when an unavoidable crash is detected triggers the AEB system? Like, maybe for AEB (which should always be running) to take over, AP has to be off?
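The damage-reduction point is basic kinematics: even when AEB can't stop in time, braking over whatever distance remains lowers the impact speed, and crash energy scales with the square of speed. A sketch with illustrative numbers (not any specific AEB system's behavior):

```python
import math

def impact_speed(v0_ms: float, decel_ms2: float, distance_m: float) -> float:
    """Speed at impact after braking at constant deceleration over
    distance_m, from v^2 = v0^2 - 2*a*d (clamped to 0 if the car stops)."""
    v_sq = v0_ms ** 2 - 2 * decel_ms2 * distance_m
    return math.sqrt(max(v_sq, 0.0))

# Illustrative: 100 km/h (~27.8 m/s), hard braking at 8 m/s^2, only
# 30 m of room. Not enough to stop, but kinetic energy (proportional
# to v^2) drops by roughly 60%.
v0 = 100 / 3.6
vi = impact_speed(v0, 8.0, 30.0)
print(f"impact at {vi * 3.6:.0f} km/h")  # prints "impact at 61 km/h"
```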

[-] Trainguyrom@reddthat.com 1 points 2 months ago

Brakes require a sufficient stopping distance given the current speed, driving surface conditions, tire condition, and the amount of momentum at play. This is why trains can't stop quickly despite having brakes (and very good ones at that, with air brakes on every wheel): there's so much momentum at play.

If Autopilot is being criticized for disengaging immediately before the crash, it's pretty safe to assume it's too late to stop the vehicle and avoid the collision.
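The stopping-distance point can be made concrete with d = v²/(2a) (illustrative deceleration figures; real stopping distances also depend on surface, tires, and load):

```python
def stopping_distance_m(speed_ms: float, decel_ms2: float) -> float:
    """Distance needed to brake to a full stop at constant deceleration."""
    return speed_ms ** 2 / (2 * decel_ms2)

# A car at 110 km/h braking hard (~8 m/s^2) needs ~58 m; a freight
# train at the same speed, decelerating at ~0.5 m/s^2, needs ~930 m.
car = stopping_distance_m(110 / 3.6, 8.0)
train = stopping_distance_m(110 / 3.6, 0.5)
print(f"car: {car:.0f} m, train: {train:.0f} m")  # prints "car: 58 m, train: 934 m"
```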

[-] filcuk@lemmy.zip 1 points 2 months ago

This autopilot shit needs a regulated audit log in a black box, like what planes or ships have.
In no way should this kind of manipulation be legal.

[-] sober_monk@lemmy.world 0 points 2 months ago

The self-driving equivalent of "Jesus take the wheel!"

this post was submitted on 19 Mar 2025
112 points (98.3% liked)

Not The Onion
