In 2019, a Tesla Model S exited a freeway in Gardena, ran a red light, and slammed into a Honda Civic
Even a bad autopilot is better than a lot of human drivers. The question is not whether an autopilot is perfect; accidents do happen, but they are much less frequent than with human drivers.
The driver is to blame for this accident, because having an autopilot does not mean the driver can go to sleep on the trip.
Yes. Unambiguously yes. The current laws on self-driving cars are very clear: the driver must always be watching the road, must always be assessing whether the car is driving safely, and must be ready to take control of the car at any time.
The law says yes. However, I think the law is wrong. I think the law should put the company that designed the system at fault. They are creating a system that encourages people not to focus on the road, and if they are doing that, they should be responsible for what the car does.
I also think that Tesla is super irresponsible for living in the “uncanny valley” of self-driving, where they advertise the system as if it were full self-driving (“Autopilot”) but then say the driver needs to pay attention. I think this should be illegal, because it is well known that humans can’t reliably pay attention to mundane tasks. I think Google did this right. They ran an early test of self-driving with their employees, noticed that a lot of them weren’t paying attention, and pulled the plug. Then they stopped testing until they had full self-driving.
Both are responsible
Tesla should be held liable. Their autopilot mode is terrifyingly bad. One of my best friends owns a Tesla Model 3 and showed me the autopilot mode. The whole time he was saying "just wait, it’ll fuck up somehow," and sure enough it inexplicably tried to take a right exit off the highway by jamming the brakes and veering to the right until my friend took manual control of it again.
I honestly can’t believe Tesla autopilot mode is allowed on roads. It’s so clearly still technology in its infancy and not at all ready for real-world use. The company misleads Tesla owners into a false sense of safety and has hordes of lawyers who’ve quite clearly done everything they can to protect Tesla from any liability. Lawmakers won’t adapt, because the whole system relies on not stifling the almighty growth of corporations like Tesla.
Doesn’t Autopilot require the driver to pay attention and keep their hands on the wheel at all times? I’d guess Tesla could be held liable if the driver could prove they tried to correct the car but faulty software didn’t let them take back control. 🤷
This is like giving a kid a cake to hold and getting mad at them when they eat it. We know that humans can’t pay attention to mundane tasks. Maybe a few people can all of the time, and most people can some of the time, but as a rule it just doesn’t happen. It is utterly irresponsible to pretend this isn’t true and ship driver-assist systems that are good enough that people stop paying attention.
I think [Autonomy Levels] 2-4 should be outright illegal. They are basically guaranteed to have people not paying full attention, which results in crashes. Level 5 should be legal, but the manufacturer should be responsible for any accidents, illegal actions, or other malfunctions.
Too many large corporations (Facebook) pretend not to know the risks!
btw: Happy cake day 🥳!
Sounds like it should be renamed after one of the most dangerous jobs in the world: “Test Pilot.”
I think it comes down to the rate of autopilot fuck-ups. If it’s close to or worse than the rate for human drivers, Tesla should definitely be held to account. Or if there are traffic scenarios where autopilot is shown to commonly put people in danger, I think that also qualifies. Of course, getting objective, non-tampered data is the hard part…
While the driver is certainly to blame for not paying attention, one must ask why he blindly trusted the system in the first place.
Tesla could still be found guilty for false advertising.
There’s an ongoing review into whether Tesla is violating the DMV regulation that bars companies from marketing their cars as autonomous when they are not.
What if there were no driver? Then Tesla has to pay.
Tesla will tell you to pay attention; the driver is always at fault. Tesla’s system is law-abiding, and the law says the driver of the car is responsible, even if they technically aren’t driving.