Fine by me, as long as the companies making the cars take all responsibility for accidents. Which, you know, the human drivers do.
But the car companies want to sell you their shitty autonomous driving software and make you be responsible.
If they don't trust it enough, why should I?
But do they really? If so, why's there the saying "if you want to murder someone, do it in a car"?
I do think self-driving cars should be held to a higher standard than humans, but I believe the fundamental disagreement is in precisely how much higher.
While zero incidents is naturally what they should be aiming for, it's more of a goal for continuous improvement, like it is for air travel.
What liability can/should we place on companies that provide autonomous drivers that will ultimately lead to safer travel for everyone?
Well, the laws for sure aren't perfect, but people are responsible for the accidents they cause. Obviously there are plenty of exceptions, like rich people, but if we're talking about the ideal real-life scenario, there are consequences for causing an accident. Whether those consequences are appropriate or not is for another discussion.
As far as I know, proper self-driving AVs (not "autopilot") are pretty close to zero incidents if you only count crashes where they are at fault.
When another car runs a red light and smashes into the side of an autonomous vehicle at 40 mph... it wasn't the AV's fault. Those crashes should not be counted, but as far as I know they currently are in most stats.
I'm fine with exactly the same liability as human drivers have. Unlike humans, who are motivated to drive dangerously for fun, to get home when they're high on drugs, or to keep driving through the night without sleep to avoid paying for a hotel, autonomous vehicles have zero motivation to take risks.
In the absence of that motivation, the simple fact that insurance against accidents is expensive is more than enough to encourage these companies to continue to invest in making their cars safer. Because the safer the cars, the lower their insurance premiums will be.
Globally, insurance against car accidents is approaching half a trillion dollars per year and increasing over time. With money like that on the line, why not spend a lazy hundred billion dollars or so on better safety? It won't actually cost anything - it will save money.
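As a rough back-of-envelope sketch of that argument (the figures below are loose assumptions based on the numbers above, not real insurance data), the safety investment pays for itself as soon as the premium savings it produces exceed its cost:

```python
# Back-of-envelope sketch of the "safety investment pays for itself" argument.
# All figures are rough assumptions, not real insurance data.

annual_premiums = 500e9      # ~half a trillion USD/year spent globally on car-accident insurance
safety_investment = 100e9    # hypothetical spend on making autonomous vehicles safer
premium_reduction = 0.25     # assumed fraction by which safer cars cut those premiums

annual_savings = annual_premiums * premium_reduction
payback_years = safety_investment / annual_savings

print(f"Annual savings: ${annual_savings / 1e9:.0f}B")  # $125B under these assumptions
print(f"Payback period: {payback_years:.1f} years")     # ~0.8 years
```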
That... almost makes it sound like the main opposition to autonomous cars would be insurance companies: they can't earn more by raising premiums if there are no accidents and a competing insurance company can offer much cheaper coverage.
Well, you shouldn't trust it, and the car company tells you this. It's not foolproof, nor something to be blindly relied on. It's a system that assists driving but doesn't replace the driver. Not in its current form, at least, though they may be getting close.
Then what's the discussion even about? I don't want autonomous cars on the street because even their creators don't trust them to make it.
Most people consider cruise control a quite useful feature, though it still requires you to pay attention so that you stay in your lane and don't run into a slower vehicle in front of you. You can then keep adding features such as radar-based adaptive cruise control and lane assist, and this further reduces what you need to pay attention to, but you still need to sit behind the wheel watching the road. These self-driving systems in their current form are no different; they're just further along the spectrum towards self-driving. Some day we will reach the point where you sitting in the driver's seat just introduces noise to the system, so you'd be better off taking a nap in the back seat. We're not there yet, however. This is still just super-sophisticated cruise control.
It's kind of like chess engines. First humans are better than computers. Then a computer plus a human is better than just the computer, and then at some point the human is no longer needed and the computer will from there on always be better.
I don't feel like this is what we were talking about - at least I was talking about cars that drive entirely on their own.
Well, Cruise is offering a fully driverless taxi service where they don't require you, the passenger, to pay attention to traffic and take control if needed, so there it's not fair to say "they don't trust it, so why should you."
With Tesla, however, that is the case, but despite their rather aggressive marketing they still make it very clear that this is not finished yet: you are allowed to use it, but you're still the driver and its safe use is your responsibility. That's the case with the beta version of any software; you get it early, which is what early adopters like, but you're expected to encounter bugs, and this is the trade-off you have to accept.
Is the company legally liable for the actions of the self driving car? If no, then they don't trust the vehicles.
What charges would apply against a human that delayed an emergency vehicle and caused someone to die?
There are several court cases ongoing about this stuff, and I'd be surprised if these companies didn't have any liability.
That's a moved goalpost, and you know it.
If liability is forced on them, that is a huge difference from them voluntarily accepting responsibility. That is what would indicate that they trusted the service they provided.
I think the issue here is that you, like many other people, seem to imagine that because a system is called "full self-driving" it literally means that. As if it's either fully human-controlled or fully AI-controlled and there's no in-between. No, this is just overly simplified black-and-white thinking that misses all the nuances of the subject.
This is utter nonsense. These companies aren't exempt from liability for the accidents they cause. Of course they don't want to be liable and would rather sweep these incidents under the rug, but that's just not going to happen. There just isn't a precedent yet, however. This is brand-new technology that no one has seen before. What the liability of these companies is going to be in the end is still under debate. It's just a blatant lie at this point to claim they have no liability, as if that's something that's been settled.
Jesus, you're like a ChatGPT bot.
Given your prolific posting in this thread, the bullshit about not wanting an FSD vehicle yourself 'because you like driving', and the frankly intentional misreading of what I write as a direct response to what you're saying, I have to ask how long you've been in the employ of some car manufacturer?
Also, I'm sick of your patronizing tone:
I have an educational background in mathematics, and I'm fairly certain I understand the math of Neural Networks better than you do. I also am aware of their shortcomings.
I hate people who assume that anyone who disagrees with them is paid or a bot. But when you make so many intentionally bad-faith arguments, combined with your patronizing asshole tone, I can't help but believe that you're a fucking paid astroturfing asshole.
Well, considering how angry you're getting and the fact that you're now resorting to mind-reading, ad-hominem attacks, and plain old name-calling, I can definitely tell you're not a ChatGPT bot, lol.
However, since you're not making any further attempts to convince me otherwise, and I don't feel like repeating the same point, I take it that the debate is over, then. Have a nice day, and thanks for the discussion. The pleasure seems to be on my side.
The incident under discussion does not involve driver-assist systems; driverless autonomous taxis are already on the streets: