
A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.

I just don't see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

[-] Buffalox@lemmy.world 2 points 2 months ago

The car made a fatal decision faster than any human could possibly correct it. Tesla’s idea that drivers can “supervise” these systems is, at this point, nothing more than a legal loophole.

What I don't get is how years of this false advertising haven't bankrupted Tesla already.

[-] echodot@feddit.uk 2 points 2 months ago

Because the US is an insane country where you can straight up just break the law and as long as you're rich enough you don't even get a slap on the wrist. If some small startup had done the same thing they'd have been shut down.

What I don't get is why Teslas aren't banned all over the world for being so fundamentally unsafe.

[-] Buffalox@lemmy.world 1 points 2 months ago

> What I don’t get is why teslas aren’t banned all over the world for being so fundamentally unsafe.

I've argued this point for the past year: there are obvious safety problems with Teslas, even without considering FSD.
Like the blinker controls on the steering wheel, manual door handles that are hard to find in emergencies, and distractions from common operations being buried in menus on the screen instead of having directly accessible buttons. With Autopilot they also tend to brake for no reason, even on the autobahn with a clear road ahead, which can also create dangerous situations.

[-] NikkiDimes@lemmy.world 0 points 2 months ago

Well, because 99% of the time, it's fairly decent. That 1%'ll getchya tho.

[-] ayyy@sh.itjust.works 2 points 2 months ago

To put your number into perspective: if it failed only once every hundred miles, it would kill you multiple times a week at the average commute distance.
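As a back-of-envelope sketch (the 40-mile round-trip commute and 5-day week are illustrative assumptions, not figures from the thread):

```python
# Hypothetical failure rate from the comment above: 1 failure per 100 miles.
failure_rate_per_mile = 1 / 100

# Assumed round-trip commute of 40 miles, 5 days a week (illustrative numbers).
commute_miles_per_week = 40 * 5

expected_failures_per_week = failure_rate_per_mile * commute_miles_per_week
print(expected_failures_per_week)  # 2.0 -- "multiple times a week"
```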

[-] KayLeadfoot@fedia.io 2 points 2 months ago

Someone who doesn't understand math downvoted you. This is the right framework for understanding autonomy: the failure rate needs to be astonishingly low for the product to have any non-negative value. So far, Tesla has not credibly demonstrated non-negative value.

[-] bluewing@lemm.ee -1 points 2 months ago* (last edited 2 months ago)

You're trying to judge the self-driving feature in a vacuum, and you can't do that. You need to compare it to the alternatives, and for automotive travel, the alternative to FSD is to continue having everyone drive manually. Turns out, most clowns doing that are statistically worse at it than even FSD (as bad as it is). So FSD doesn't need to be perfect; it just needs to be a bit better than what the average driver can do manually. And the last time I saw anything about it, FSD was statistically that "bit better" than you.

FSD isn't perfect. No such system will ever be perfect. But, the goal isn't perfect, it just needs to be better than you.

[-] echodot@feddit.uk 2 points 2 months ago

> FSD isn't perfect. No such system will ever be perfect. But, the goal isn't perfect, it just needs to be better than you.

Yeah, people keep bringing that up as a counter-argument, but I'm pretty certain humans don't swerve off a perfectly straight road into a tree all that often.

So unless you have numbers to suggest that humans are less safe than FSD then you're being equally obtuse.

[-] jamesjams@lemmy.world 2 points 2 months ago

Humans do swerve off perfectly straight roads into trees, I know because I've done it!

[-] echodot@feddit.uk 2 points 2 months ago

Can you confirm that to the best of your knowledge you are not a robot?

[-] KayLeadfoot@fedia.io 2 points 2 months ago

This little subthread looks like this.

[-] bluewing@lemm.ee 1 points 2 months ago

A simple Google search (which YOU could have done yourself) shows it's about 1 accident per 1.5 million miles driven with FSD vs 1 per 700,000 miles driven for manually driven cars. I'm no Tesla stan (I think they are overpriced and deliberately for rich people only), but that's an improvement, a noticeable improvement.

And as an old retired medic who worked his share of car accidents over nearly 20 years: yes, humans do swerve off perfectly straight roads and hit trees and anything else in the way, and they do so at a higher rate.
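Taking the two figures above at face value (they are the commenter's numbers, not independently verified), the relative rates work out like this:

```python
# Claimed miles per accident (from the comment above; unverified).
fsd_miles_per_accident = 1_500_000
manual_miles_per_accident = 700_000

# Accidents per million miles, for easier comparison.
fsd_rate = 1_000_000 / fsd_miles_per_accident        # ~0.67
manual_rate = 1_000_000 / manual_miles_per_accident  # ~1.43

# Under these numbers, manual driving has roughly twice the accident rate.
print(round(manual_rate / fsd_rate, 2))  # 2.14
```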

[-] FreedomAdvocate@lemmy.net.au -1 points 2 months ago

What is the failure rate? Unless you know that you can’t make that claim.

[-] NikkiDimes@lemmy.world 1 points 2 months ago

It absolutely fails miserably fairly often, though, and would likely crash that frequently without human intervention. Not to the extent here, where there wasn't even time for human intervention, but I frequently had to take over when I used to use it (post v13).

[-] Buffalox@lemmy.world 1 points 2 months ago

Many Tesla owners are definitely dead many times, on the inside.

[-] echodot@feddit.uk 1 points 2 months ago

Those probably aren't the real failure odds, but a 1% failure rate is several thousand times higher than what NASA would consider an abort-risk condition.

Let's say the risk is only 0.01%; that's still several thousand crashes per year. Even if we could guarantee that all of them would be non-fatal and would not involve any bystanders such as pedestrians, the cost of replacing all of those vehicles every time they crashed, plus fixing the damage to the things they crashed into (lamp posts, shop windows, etc.), would be so high that it would exceed any benefit of the technology.
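The 0.01% figure doesn't say "per what", so here is one hypothetical reading: a 0.01% chance of a crash per FSD drive, with an assumed fleet-wide volume of 100,000 FSD drives per day (both numbers are illustrative, not sourced):

```python
# Hypothetical reading of the 0.01% figure: 1 crash per 10,000 FSD drives.
drives_per_crash = 10_000

# Assumed fleet-wide FSD drives per day (illustrative, not sourced).
fsd_drives_per_day = 100_000

crashes_per_year = fsd_drives_per_day * 365 / drives_per_crash
print(crashes_per_year)  # 3650.0 -- "several thousand crashes per year"
```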

It wouldn't be as bad if this were prototype technology that was constantly improving, but Tesla has made it very clear they're never going to add lidar scanners, so it's literally never going to get any better; it's always going to be this bad.

[-] KayLeadfoot@fedia.io 1 points 2 months ago

> ...is literally never going to get any better it's always going to be this bad.

Hey now! That's unfair. It is constantly changing. Software updates introduce new reversions all the time. So it will be this bad, or significantly worse, and you won't know which until it tries to kill you in new and unexpected ways :j

[-] FreedomAdvocate@lemmy.net.au -1 points 2 months ago

Saying it’s never going to get better is ridiculous and demonstrably wrong. It has improved in leaps and bounds over generations. It doesn’t need LiDAR.

The biggest thing you're missing is that with FSD the driver is still supposed to be paying attention at all times, ready to take over like a driving instructor does when a learner is doing something dangerous. Just because it's in FSD Supervised mode doesn't mean you should just sit back and watch it drive you off the road into a lake.

[-] FreedomAdvocate@lemmy.net.au -1 points 2 months ago

What false advertising? It’s called “Full Self Driving (Supervised)”.

[-] Buffalox@lemmy.world 0 points 2 months ago* (last edited 2 months ago)

For many years the "Supervised" wasn't included; AFAIK Tesla was forced to add it.
And in this case "supervised" isn't even enough, because the car made an abrupt, unexpected maneuver instead of asking the driver to take over in time to react.

[-] FreedomAdvocate@lemmy.net.au -1 points 2 months ago

The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.

[-] Buffalox@lemmy.world 0 points 2 months ago* (last edited 2 months ago)

No. If you look at Waymo as an example, they are actually autonomous, and they stop to ask for assistance in situations they are “unsure” how to handle.

But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? The road ahead was clear, with nothing in view to indicate any kind of problem, when the car made a sudden abrupt left that caused it to roll upside down.

[-] FreedomAdvocate@lemmy.net.au 0 points 2 months ago* (last edited 2 months ago)

They can’t stop and ask for assistance at 100km/h on a highway.

I hope Tesla/Musk address this accident and get the telemetry from the car, because there's no evidence that FSD was even on.

[-] Buffalox@lemmy.world 0 points 2 months ago

According to the driver it was on FSD, and it was using the latest software update available.

https://www.reddit.com/user/SynNightmare/

> They can’t stop and ask for assistance at 100km/h on a highway.

Maybe the point, then, is that Tesla FSD shouldn't be legally used on a highway.
But it probably shouldn't be used anywhere, because it's faulty as shit.
And why can't it slow down to let the driver take over in a timely manner, when it can brake for no reason?
It was tested in Germany on the Autobahn, where it did that 8 times within 6 hours!!!

[-] FreedomAdvocate@lemmy.net.au -3 points 2 months ago* (last edited 2 months ago)

According to the driver, with zero evidence backing up the claim. With how much of a hard-on everyone has for blaming Elon Musk for everything and trying to drag Tesla's stock down, this accident is a surefire way to thousands of internet karma and e-fame on sites like Reddit and Lemmy. Why doesn't he just show us the interior camera?

Looking at his profile he’s milking this for all it’s worth - he’s posted the same thread to like 8 different subs lol. He’s karma whoring. He probably wasn’t even the one involved in the crash.

Looked at his Twitter, which he promoted on there too, and of course he tags Mark Rober and is retweeting everything about this crash. He's loving the attention and doing everything he can to get more.

Also he had the car for less than 2 weeks and said he used FSD “all the time”……in a brand new car he’d basically never driven…..and then it does this catastrophic failure? Yeh nah lol. Also as others in some of the threads have pointed out, the version of FSD he claims it was on wasn’t out at the time of his accident.

Dude's lying through his teeth.

[-] Buffalox@lemmy.world 0 points 2 months ago

There have been other similar cases lately, which clearly indicate problems with the car.
The driver has put up the footage from all the cameras of the car, so he has done what he can to provide evidence.

https://www.reddit.com/r/TeslaFSD/comments/1ksa79y/1328_fsd_accident/

It's very clear from the comments that some have personally experienced similar things, and others have seen reporting of it.
This is not an isolated incident; it just has better footage than most.

[-] FreedomAdvocate@lemmy.net.au -2 points 2 months ago

Also this happened in February. He never reached out to Tesla? He never requested the data to show that FSD was engaged? In that thread he says he only just did it. There’s also an official Tesla software program you can use to get the full logs, but as expected he hasn’t done that.

Dude's lying for sure.

[-] Buffalox@lemmy.world 1 points 2 months ago* (last edited 2 months ago)

February explains why he wasn't on 13.2.9.
Why would he reach out to Tesla? That's the insurance company's job, not his.
But there's no point anyway, because Tesla never takes responsibility in these cases.

[-] FreedomAdvocate@lemmy.net.au -2 points 2 months ago

It also means he couldn't have been on version 13.2.8, which he claims he was on.

https://www.reddit.com/r/TeslaFSD/comments/1ksa79y/1328_fsd_accident/mtlj5ki/

My point in saying he didn't reach out to Tesla is that if I owned a car that drove itself off the road into a tree, I'd reach out to Tesla and ask them to investigate and see what they'd do for me after almost killing me. Insurance is a completely different story; they'll go and do their thing regardless.

[-] Buffalox@lemmy.world 0 points 2 months ago* (last edited 2 months ago)

You are so full of shit, I just checked it out:

https://www.reddit.com/r/technology/comments/1kskfqd/comment/mtmbkvm/?context=3

> Ay that’s me thank you for tagging me. I know that there’s a lot of skepticism about my accident. I leased the car at the beginning of February and this happened at the end of February. I was using FSD every chance it would let me. I did not have time to react the cop said it was going 55 miles when it crashed me. I requested the data log today as somebody suggested to me.

He never claimed it was recent.

You never provide sources for the claims you make, because you are probably just parroting hearsay.

[-] FreedomAdvocate@lemmy.net.au -2 points 2 months ago* (last edited 2 months ago)

> He never claimed it was recent.

Did you not even read my post? Literally THE FIRST SENTENCE OF MY POST IS THIS:

> Also this happened in February.

The part of mine that you misunderstood as me saying it was recent was, I assume, this:

> He never requested the data to show that FSD was engaged? In that thread he says he only just did it.

Again, you misread and misunderstood (a common theme with you, apparently): I said he only just requested the data from Tesla, not that he only just crashed the car. The quote you posted of the guy literally confirms what I said: that the crash was in February and that he only just requested the data lol

> You are so full of shit, I just checked it out:

> Every claim you make you never provide sources, because you are probably just parroting hearsay.

Care to apologize?

[-] FreedomAdvocate@lemmy.net.au -2 points 2 months ago

Just no footage from the interior camera, no proof of FSD being used.

Others have pointed out critical holes in his story - namely that he claims that he was on a version of FSD that was not released at the time of his crash.

[-] Buffalox@lemmy.world 0 points 2 months ago

The link I gave you is the place he posted this. And you can see what version he says he was using:

> I find it entertaining honestly there is so many conspiracy theories around my incident people saying it’s not a Tesla, the robo taxi drama, both political sides think I’m anti Elon, others saying that I wasn’t on 13.2.8 I definitely put mine on the beta channel to get that version as quick as I did

https://www.notateslaapp.com/fsd-beta/
So you are parroting bullshit: the current version is 13.2.9.

> Just no footage from the interior camera, no proof of FSD being used.

Funny how the people in the thread I linked, who drive Teslas themselves, don't question this.
Some people believe the FSD saw the shadow of the pole as a curb in the road, or maybe even the base of a wall, and that's why it decided to “evade”.
There are plenty of examples in the comments from people who drive Teslas themselves about how it steers into oncoming traffic: one describes how his followed black skid marks in the road, weaving wildly left to right; another describes how his made an evasive maneuver because of a patch in the road. It just goes on and on with how faulty FSD is.

I don't know which Tesla models have which cameras, but I've seen plenty of reporting on Tesla FSD, and none of it is good.
So why do you believe it's more likely to be human error? A human not paying attention would be much more likely to veer off slowly rather than make an abrupt, idiotic maneuver.

To me it seems you are the one who lacks evidence for your claims.
And the problem with Tesla's logging is that it's a proprietary system that only Tesla has access to; that system needs to be open for everybody to examine.

[-] FreedomAdvocate@lemmy.net.au -1 points 2 months ago* (last edited 2 months ago)

> The link I gave you is the place he posted this. And you can see what version he says he was using:

What he says is what's questionable, though, because he says he was on 13.2.8 and that wasn't available to the public at the time of his crash. Being on the beta channel means nothing; he still wouldn't have gotten it, because it wasn't available on the beta channel that day.

https://www.reddit.com/r/TeslaFSD/comments/1ksa79y/1328_fsd_accident/mtlj5ki/

> So you are parroting bullshit, the current version is 13.2.9.

How am I parroting bullshit? The current version is 13.2.9? Great, but that has nothing to do with what I said.

> Funny how people in the thread I linked to you, who drive Tesla themselves don’t question this?

Some of them did, because he says he was on 13.2.8, which added different camera recordings that aren't available on his car. He clearly didn't know this when he lied about being on 13.2.8, because if he had been on it, those recordings would have been there. Reddit being Reddit, and the internet being the internet: people lie. Most of the people in there claiming their Teslas tried to do the same thing are likely 12-year-olds who've never driven a car before. Don't believe everything you read on the internet, ESPECIALLY on Reddit lol.

Also, if you haven't realized, hating on anything Tesla/Musk related is the cool thing to do at the moment, especially for people on left-leaning sites like Reddit and Lemmy.

> Some people believe the FSD saw the shadow of the pole as a curb in the road, or maybe even the base of a wall. And that’s why the FSD decided to “evade”.

Cool, that doesn't mean they're right. The FSD saw shadows of poles and trees and power lines all the way up to that second too. Well, it would have if it had been engaged.

> To me it seems you are the one who lacks evidence in your claims.

My claim is that it wasn't using FSD, because he has provided ZERO evidence that he WAS using FSD. I'm the one asking for evidence to support the claim, and he has provided NONE. Remember: I'm not the one making the initial claim; I'm the one asking for evidence to support it.

Even if I were the one who needed to provide evidence, I have provided it: he lied about the version, there's no interior footage to prove it wasn't driver error, and he didn't reach out to Tesla about their car trying to kill him.

> And problem with Tesla logging is that it’s a proprietary system that only Tesla has access to, that system needs to be open for everybody to examine.

Tesla supposedly has a program you can download to review your logs.

this post was submitted on 23 May 2025
9 points (100.0% liked)

Technology
