this post was submitted on 30 Oct 2024
651 points (89.4% liked)
Technology
I know a lot of people here are/will be mad at Musk simply out of personal political disagreement, but even putting that aside, I've never liked the idea of self-driving cars. There's just too much that can go wrong too easily. In a one-ton-plus piece of metal and glass moving at speeds approaching 100 mph, you need enough control to respond within a few seconds when the unexpected happens, like a deer jumping into the middle of the road. Computers don't, and may never, have the contextual awareness to make the right decision as often as a human would in those situations. I'm not going to cheer for the downfall of Musk or Tesla as a whole, but they severely need to reconsider this idea, or a lot of people will be hurt and/or killed and a lot of liability will land on them when it happens. That's a lot of risk to take on for a smaller automaker like them, just thinking in business terms.
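To put a rough number on that reaction-time worry, here's a back-of-the-envelope sketch (the 1.5 s human reaction time and the chosen speeds are illustrative assumptions, not measured figures):

```python
MPH_TO_MPS = 0.44704  # metres per second in one mph

def reaction_distance_m(speed_mph: float, reaction_s: float) -> float:
    """Distance travelled (in metres) before braking even begins."""
    return speed_mph * MPH_TO_MPS * reaction_s

# An alert human driver is often quoted at roughly 1.5 s reaction time.
for speed in (30, 60, 100):
    print(f"{speed:>3} mph: {reaction_distance_m(speed, 1.5):.0f} m before the brakes touch")
```

At 100 mph that's nearly 70 m of travel before braking even starts, which is why "a few seconds" is already a generous window.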
I mean we do let humans drive cars and some of them are as dumb as bricks and some are malicious little freaks.
Not saying we are anywhere near FSD, and Elon is a clown, but I would support a future with this technology if we ever got there. The issue is it would have to be all or nothing: you can't have a mix of robots and people driving around.
The problem is that with dumb drivers you can easily place blame on the driver and make him pay for his idiocy. FSD is a lot more complicated. You can't really blame the driver, since he wasn't driving the car, but neither was the engineer or the company itself. We'd have to draw up entirely new frameworks to define and assign criminal negligence where it exists. Is the company responsible for a malicious developer? Is the company responsible for a driver who ignores a set guideline and sits impaired behind the emergency stop? Is the driver responsible for a software fault?
All of these questions, and many more, need to be answered. Some probably can't be and must remain a so-called "act of God" with no blame to place. And people are not fond of blaming just the software; they're out for blood when an accident happens, and software doesn't bleed. Of course, the questions above might be the easiest to answer, but the point still stands.
Full self-driving should only be implemented when the system is good enough to completely take over all driving functions, and it should only be available in vehicles without steering wheels. The Tesla solution of having "self driving" while relying on the cop-out of requiring constant user attention and feedback is ridiculous. Only when a system is truly capable of driving 100% autonomously, at a level statistically far better than a human, should any kind of self-driving be allowed on the road. Systems like Tesla's FSD officially require you to always be ready to intervene at a moment's notice. They know their system isn't ready for independent use yet, so they require that manual input. But of course this encourages disengaged driving; no one actually pays attention to the road the way they'd need to in order to intervene at a moment's notice. Tesla's FSD imitates true self-driving, but it pawns the liability off to drivers by requiring them to pay attention at all times. This should be illegal. Beyond mere lane-assistance technology, no self-driving tech should be allowed except in vehicles without steering wheels. If your AI can't truly perform better than a human, it's better for humans to be the only ones actively driving the vehicle.
This also solves the civil liability problem. Tesla's current system has a dubious liability structure designed to pawn liability off to the driver. But if there isn't even a steering wheel in the car, then the liability must fall entirely on the vehicle manufacturer. They are after all 100% responsible for the algorithm that controls the vehicle, and you should ultimately have legal liability for the algorithms you create. Is your company not confident enough in its self-driving tech to assume full legal liability for the actions of your vehicles? No? Then your tech isn't good enough yet. There can be a process for car companies to subcontract out the payment of legal claims against the company. They can hire State Farm or whoever to handle insurance claims against them. But ultimately, legal liability will fall on the company.
This also avoids criminal liability. If you only allow full self-driving in vehicles without steering wheels, there is zero doubt about who is in control of the car. There isn't a driver anymore, only passengers. Even if you're sitting in what would normally be the driver's seat, it doesn't matter; you are just a passenger, legally. You can be as tired, distracted, drunk, or high as you like, and you're not getting any criminal liability for driving the vehicle. There is such a clear bright line - there is literally no steering wheel - that it is absolutely undeniable that you have zero control over the vehicle.
This actually would work under the same theory of existing drunk-driving law. People can get ticketed for drunk driving for sleeping in their cars. Even if the cops never see you driving, you can get charged for drunk driving if they find you in a position where you could drunk drive. So if you have your keys on you while sleeping drunk in a parked car, you can get charged with DD. But not having a steering wheel at all would be the equivalent of not having the keys to a vehicle - you are literally incapable of operating it. And if you are not capable of operating it, you cannot be criminally liable for any crime relating to its operation.
An FSD car that makes perfect decisions would theoretically be safer than a human driver who also makes perfect decisions, if for no other reason than the car could do it faster.
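The "the car could do it faster" point can be made concrete with basic stopping-distance physics (the reaction times and the 7 m/s² braking deceleration below are assumptions for illustration, not measured values):

```python
MPH_TO_MPS = 0.44704  # metres per second in one mph

def stopping_distance_m(speed_mph: float, reaction_s: float, decel: float = 7.0) -> float:
    """Total stopping distance: reaction distance plus braking distance.
    d = v * t_react + v^2 / (2 * a), with v in m/s and a in m/s^2."""
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v ** 2 / (2 * decel)

human = stopping_distance_m(70, reaction_s=1.5)    # alert human, ~1.5 s to react
machine = stopping_distance_m(70, reaction_s=0.1)  # hypothetical system, ~0.1 s
print(f"human: {human:.0f} m, machine: {machine:.0f} m, saved: {human - machine:.0f} m")
```

The braking distance is identical in both cases; the entire gap comes from reaction time, which at 70 mph works out to roughly a 44 m difference under these assumptions.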
Personally, I would love to see autonomous cars see widespread use. They don't have to be perfect, just safer mile-for-mile than human drivers. (Which means that Teslas, with Musk's gobsmackingly stupid insistence on only using cameras, will never reach that threshold).
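"Safer mile-for-mile" is also a measurable bar. A simple Poisson argument shows how many fatality-free miles a fleet would need to log before anyone could confidently claim it beats the human baseline (the ~1.33 fatalities per 100 million vehicle-miles figure is an approximate recent US number, used here as an assumption):

```python
import math

HUMAN_RATE = 1.33 / 1e8  # fatalities per vehicle-mile (assumed human baseline)

def miles_needed(baseline_rate: float, confidence: float = 0.95) -> float:
    """Fatality-free miles needed so that observing zero deaths rejects
    'the fleet is at least as dangerous as the baseline' at the given level:
    P(0 deaths) = exp(-rate * miles) <= 1 - confidence."""
    return math.log(1.0 / (1.0 - confidence)) / baseline_rate

print(f"{miles_needed(HUMAN_RATE) / 1e6:.0f} million fatality-free miles")
```

That comes out on the order of hundreds of millions of miles just to clear the statistical bar, which is why per-mile safety claims from small fleets deserve skepticism.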