submitted 2 years ago by L4s@lemmy.world to c/technology@lemmy.world

Virginia sheriff's office says Tesla was running on Autopilot moments before tractor-trailer crash::Virginia authorities have determined that a Tesla was operating on its Autopilot system and was speeding in the moments leading to a crash with a crossing tractor-trailer last July that killed the Tesla driver. The death of Pablo Teodoro III, 57, is the third since 2016 in which a Tesla that was using Autopilot ran underneath a crossing tractor-trailer, raising questions about the partially automated system's safety and where it should be allowed to operate. The crash south of Washington remains under investigation by the U.S. National Highway Traffic Safety Administration, which sent investigators to Virginia last summer and began a broader probe of Autopilot more than two years ago.

all 24 comments
[-] 1984@lemmy.today 39 points 2 years ago

Honestly, I think people trust these systems way too much. They think it's safe because of the marketing.

[-] Fades@lemmy.world 25 points 2 years ago

No coincidence that Tesla is attacking those who are trying to hold them accountable for false advertising. They claim it infringes on their 1A rights. Fucking capitalist pigs.

[-] kurcatovium@lemm.ee 9 points 2 years ago

Well, people wouldn't trust it if there weren't misleading marketing in the first place.

[-] asdfasdfasdf@lemmy.world -1 points 2 years ago

There's a huge warning when you turn on Autopilot saying you always need to be paying attention and ready to take over at any time, and you have to acknowledge it before the system turns on. It also checks periodically that you're in control of the wheel, or else it will disengage.

The reason these people crash is that they're morons.

[-] Toine@sh.itjust.works 6 points 2 years ago* (last edited 2 years ago)

> It also checks periodically that you're in control of the wheel, or else it will disengage.

Disengaging autopilot when it detects you are not in control of the wheel sounds a bit dangerous.

[-] Riven@lemmy.dbzer0.com 4 points 2 years ago

Honda has lane assist and adaptive cruise control. It also disengages both of them if it doesn't detect driver feedback after a while.

I tested it out of curiosity when I was alone on a long, straight, empty freeway, for anyone wondering. I had my hands hovering over the wheel, ready to take control just in case, but otherwise didn't touch it after I engaged both systems.

I did a couple of tests, and it disengaged them every time after a little while.

[-] BearOfaTime@lemm.ee 1 points 2 years ago

It alerts you. You'd have to be asleep to not get the alerts.

In cars with adaptive cruise, lane keeping, etc., it'll beep and flash on the dash if you're not steering enough (I have a car that complains all the time because I don't grip the wheel like an ape).

Some will shake the wheel too. Or slow the car down if you don't respond.

[-] asdfasdfasdf@lemmy.world -1 points 2 years ago
[-] PipedLinkBot@feddit.rocks 2 points 2 years ago

Here is an alternative Piped link(s):

https://www.piped.video/watch?v=oBIKikBmdN8

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I'm open-source; check me out at GitHub.

[-] DoomBot5@lemmy.world 22 points 2 years ago

Tesla's biggest downfall is probably going to be the fact that they called this system Autopilot. It still requires drivers to pay attention, but these drivers treat it as a driver replacement, and then everyone wonders why they crash.

[-] RememberTheApollo@lemmy.world 14 points 2 years ago

And they make you pay for “Full Self Drive”, another allusion to a fully self-driving vehicle.

[-] set_secret@lemmy.world 3 points 2 years ago* (last edited 2 years ago)

This has to be a massive class action waiting to happen, imo. It's not autopilot and it sure as shit isn't full self driving. Tesla needs to be held accountable for these lies.

[-] mspencer712@programming.dev 6 points 2 years ago

Agreed. They are deliberately taking advantage of the fact that people don’t understand how autopilot is actually used in aircraft.

Sure, the most pedantic of us will point out that, with the autopilot engaged, the pilot flying is still in command of the aircraft and still responsible for the safe conduct of the flight. Pilots don’t** engage the autopilot and then leave the cockpit unattended. They prepare for the next phase of flight, monitor their surroundings, prepare for top-of-descent, and stay mentally ahead of the rapid-fire events and requirements of a safe approach and landing. Good pilots let the autopilot free them up for other tasks, while always preparing for the very real possibility that it will malfunction in the most lethal way possible at the worst possible moment.

Do non-pilots understand that? No. The parent poster is absolutely correct: Tesla is taking advantage of people’s misunderstanding, and then hiding behind the pedantic truth about what a real autopilot is actually for.

** Occasionally pilots do, and many times something goes horribly wrong unexpectedly and they die. Smart, responsible pilots don’t. Further, sometimes pilots fail to manage their autopilot correctly, or use it without understanding how it can behave when something goes wrong. (RIP to aviation YouTuber TNFlygirl, who had a fatal accident six days ago, suspected to be due to mismanagement of an unfamiliar autopilot system.)

[-] RememberTheApollo@lemmy.world 6 points 2 years ago

Pilots, at least at the upper echelons, have it drilled into them that they are responsible for the aircraft, their actions in it, and those aboard it. I cannot stress enough the difference between the casual attitude with which the vast majority of people view their actions behind the wheel and the attitude and responsibility required to operate a complex commercial aircraft.

Autopilot is a generally necessary convenience for operating aircraft on long flights, for efficiency, comfort, and fatigue prevention…but it gets turned off instantly should safety require it and conditions warrant it.

In a car? People use it for reading, watching video clips, dozing off if they can get away with it, and letting it drive them right into a wreck or cause one.

The problem isn’t necessarily the system (though Tesla’s FSD is full of problems); it’s the fact that drivers are willful dumbasses with no real understanding of their car’s systems or their responsibilities regarding them.

[-] AA5B@lemmy.world 2 points 2 years ago

Plus there are multiple levels of autopilot. The plane I flew had a half-assed single-axis autopilot that was usually not worth using, although when things were hectic it could maybe help reduce workload slightly.

[-] DoomBot5@lemmy.world 1 points 2 years ago

Your disclaimer basically describes all these Tesla fatalities just the same. You just substituted Tesla with aircraft.

[-] lemann@lemmy.one 1 points 2 years ago

Is Tesla still training the Autopilot neural network in 3D worlds, or are they now entirely relying on driver data?

this post was submitted on 13 Dec 2023
164 points (98.2% liked)
