Tesla driver charged with vehicular manslaughter in fatal Autopilot crash
Category: News & Politics
Via: perrie-halpern • 2 years ago • 24 comments • By: The Associated Press
Kevin George Aziz Riad, 27, is likely the first motorist to be accused of a felony in the United States after a fatal accident while using a partially automated driving system.
According to authorities, his 2016 Tesla Model S collided with a Honda Civic in Gardena, California, on Dec. 29, 2019, killing Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez.
A civil case that names Riad and Tesla Motors Inc. as defendants alleges that the car was traveling at an "excessively high rate of speed" when it crashed.
Riad and a woman in the Tesla were hospitalized with non-life-threatening injuries.
Riad, a limousine service driver, was charged in October with two counts of vehicular manslaughter with gross negligence, but the charges came to light only last week, according to The Associated Press. His attorney and Tesla did not immediately respond to a request for comment on Wednesday.
Riad has pleaded not guilty to the charges and is out on bail while the case is pending. His preliminary hearing is scheduled for Feb. 23.
The criminal charges aren't the first involving an automated driving system, but they are the first to involve a widely used driver-assist technology. Authorities in Arizona filed a charge of negligent homicide in 2020 against a driver Uber had hired to take part in the testing of a fully autonomous vehicle on public roads. The Uber vehicle, an SUV with the human backup driver on board, struck and killed a pedestrian.
By contrast, Autopilot and other driver-assist systems are widely used on roads across the world. An estimated 765,000 Tesla vehicles are equipped with it in the United States alone.
I was going to ask why the autopilot was not being charged, but then I thought of the analogy of a man who owns pit bulls that kill a person. However, even though the dogs' owner is charged with an offence, the dogs are probably put down as well. So I guess the Tesla might face being demolished due to that precedent.
There is a lot of money riding on this case, billions and billions of dollars. The courts need to weigh in on liability issues before the insurance companies can start the actuarial process... I suspect it will go all the way to the Supreme Court before it is all over....
A brand new tech with all new issues to consider...
Planes on autopilot still need someone monitoring the systems under current regulations. The same logic should apply to auto piloted ground transportation.
I know this is going to sound terrible, but... that has to be the smartest thing I have ever read from you.
Only a fool would let a high-performance sports car have complete control.
Only a foolish company would sell a flawed car.
Semi-auto functionality is an accident waiting to happen. The driver could easily get used to the vehicle ‘knowing what to do’ and wind up with neither auto nor manual attention. That may not be what happened here but it certainly is a concern in general.
In contrast, safeguards that catch human failures (e.g. running off the road, failing to brake, etc.) are good, since the worst-case scenario is avoiding an accident.
What is not clear is how a Tesla would fail to brake here. Does it not automatically brake to avoid an accident even in manual control mode?
It's supposed to; this is the same problem the Uber auto driver had when it killed a pedestrian several years back.... The tech isn't mature/capable enough to handle every possible combination of real life, is what they determined, and it may be several decades until it is... The problem is it works at slow speeds but has sensory gaps at highway speeds... The systems are just not computationally fast enough yet to handle full-speed real life.... But of course they are not sufficiently warning people of this in the drive to get the tech out there...
I'm guessing speed was a factor and braking was applied too late to be effective.
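The point above about highway speeds can be illustrated with basic physics (a generic back-of-the-envelope sketch, not any manufacturer's actual braking logic; the latency and deceleration figures are assumptions):

```python
# Generic physics sketch: stopping distance grows with the square of speed,
# so the reaction window an automatic-braking system has shrinks fast at
# highway speeds. Figures below are illustrative assumptions.

def stopping_distance_m(speed_mps: float, decel_mps2: float = 7.0,
                        system_latency_s: float = 0.5) -> float:
    """Distance covered during sensing/actuation latency, plus braking
    distance v^2 / (2a). A decel of ~7 m/s^2 is a rough dry-pavement
    figure; 0.5 s latency is an assumed sense-and-react delay."""
    return speed_mps * system_latency_s + speed_mps ** 2 / (2 * decel_mps2)

city = stopping_distance_m(13.4)     # ~30 mph -> roughly 19.5 m
highway = stopping_distance_m(31.3)  # ~70 mph -> roughly 85.6 m
print(round(city, 1), round(highway, 1))
```

Slightly more than doubling the speed more than quadruples the distance needed to stop, which is one way to see why a system tuned for city speeds can run out of room on a highway.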
I'm still amazed we have cars on the road that go well over safe driving speeds. I'm not talking about posted speed limits, I'm talking about unmodified cars going over 100 mph. It makes no sense to me.
Actuarial statistics... cars can go that fast because they don't kill enough people to make it a loss against profitability... That's the process in play here; the arguments are going to be that it's an aberration that doesn't happen enough to warrant fixing.... The old "we can build a car that won't kill anyone, but at such cost no one can afford to buy it"....
Apparently the owner was only paying attention to how far down the accelerator was mashed.
Actually, the Tesla autopilot controls the accelerator...
One would think that there must be a limit to how high a speed limit could be set on autopilot. If Tesla allows the autopilot to be set faster than the speed limit of the road you’re on then that is a clear liability on their part.
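The safeguard this comment proposes could be sketched in a few lines (a hypothetical illustration of the idea, not Tesla's actual behavior; the function and parameter names are invented):

```python
# Hypothetical sketch of the proposed safeguard: never let the driver's
# requested cruise set-speed exceed the posted limit for the current road.
# An optional tolerance could permit a small grace margin.

def clamp_set_speed(requested_mph: float, posted_limit_mph: float,
                    tolerance_mph: float = 0.0) -> float:
    """Return the speed the system would actually hold."""
    return min(requested_mph, posted_limit_mph + tolerance_mph)

print(clamp_set_speed(85, 65))  # 65: capped at the posted limit
print(clamp_set_speed(55, 65))  # 55: driver's lower request is honored
```

Whether such a cap should be mandatory, and who is liable when it is absent, is exactly the question the lawsuit raises.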
Probably.
I wouldn't doubt that would be the reply to my question, but I don't buy the reality. Yes, it would cost to retool the process, but the actual cost of the car, without the added cost of retooling, would be the same.
The cost of the different material types needed to make the car destruction-proof would price it out of the market... the basic design itself would change little.... The details would change a lot, though.... The problem with the automobile and its death/injury rate is that it was designed and produced in an age when they didn't do cost/benefit analysis... the car was viewed as just another way to do various jobs easier than the previous ways, and as with the old ways, some negative results had to be accepted...
As the tech improved, that attitude didn't, because it was accepted as part of the risks of using the tech... Same as ship sinkings and airplane crashes...
With ship sinkings and airplane crashes came another situation, mass injury/death, which is not something that the industry can consider a loss-leader cost to be borne by the manufacturer's legal exemption... Today, we have ships, properly operated, that should never sink... we have airplanes where all the pilot has to do is get it to the end of the runway; the airplane is entirely capable of doing the rest all by itself.... (with much, much lower accident rates)
The auto-driving or piloting car is not going to get the same laissez-faire treatment as the original automobile did... it's going to have to have the same standard of testing before it is fully accepted in this litigious society...
It's new tech, it's going to have to prove itself in everyday life, this is just the start of a long long road...
Just my personal opinion, but I disagree that Tesla is at all responsible. As a driver, it is solely your responsibility to pay attention while driving, be in control, and follow local traffic laws/safe practices.
Tesla would be responsible, IMO, if the vehicle malfunctioned. That is, if the vehicle failed to properly perform its stated functions. Ultimately I think your view is correct, the driver needs to be ultimately responsible and actively engaged to take control when necessary.
Correct.
I've driven hundreds of thousands of miles, many on cruise control systems that weren't intelligent.
This is absolutely no different than not tapping the brake pedal when you feel the speed control on any other domestic or imported model, smart or otherwise, isn't doing as you, the driver, intended.
Speeding through a red light is either intentional or the result of negligence, of not being aware of what the car was doing.
This is more about the car being a Tesla than anything else in my opinion.
Worst case scenario, a driver realizes the car is accelerating and he can't stop it.
Options for a trained chauffeur?
Riad T-boned the victims' car at a high rate of speed at an intersection.
Riad effed up.
That is my opinion on what it comes down to; the legal theory reads that the driver is responsible for the operation of any motor vehicle under his direct control at all times... It is incumbent on the driver of the vehicle to make the argument and establish that the vehicle malfunctioned....
The answer to that question will determine the liability, and hence who pays the price...
Two things here are driving this case, Tesla, the deep pockets, and Autopilot, the potentially unsound technology... This is a lawsuit against the technology...
Essentially, it's a lawsuit against Elon Musk aka Tesla and change.
Yep, one I'm sure they knew was eventually going to come...
I'm not familiar with these systems, but the article said,
A little research is due to determine what the partially auto system is vs the fully automated driving system.
ADAS Different Levels Explained - Pitstop (getpitstop.com)
Yeah that's the framework for the tech descriptions at this point... Level 5 doesn't exist yet and level 4 is only being used as test beds.. Level 3 is the highest available on any car in production and for sale...
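For reference, the framework the link above describes is the SAE J3016 classification; a minimal sketch of it (an illustration only, not SAE's or any manufacturer's code; the function name is invented) looks like:

```python
# Sketch of the SAE J3016 driving-automation levels referenced above.
# Level names/summaries follow the public SAE taxonomy; the code
# structure itself is purely illustrative.

SAE_LEVELS = {
    0: ("No Automation", "warnings only; the human does all driving"),
    1: ("Driver Assistance", "steering OR speed control, not both"),
    2: ("Partial Automation", "steering AND speed; driver must monitor"),
    3: ("Conditional Automation", "system monitors; driver takes over on request"),
    4: ("High Automation", "no driver needed within a limited operating domain"),
    5: ("Full Automation", "no driver needed under any conditions"),
}

def driver_must_monitor(level: int) -> bool:
    """At Level 2 and below, the human remains the fallback at all times."""
    return level <= 2

# Tesla Autopilot is generally classified as Level 2:
print(driver_must_monitor(2))  # True: the driver stays responsible
```

The Level 2 line is the crux of the case: legally, a Level 2 system assists, but the human never stops being the driver.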
So yes, I believe Riad's responsibility will become the paramount issue, as long as Tesla can prove no malfunction of the system...
I am going to have to return to the laws of the sea.
No matter how automated a Navy ship is
( and modern freight super tankers and super freighters are the very essence of modern technology )
The Captain/Commander/Skipper is ALWAYS RESPONSIBLE FOR HIS SHIP
AS EVIDENCED BY TOO MANY STORIES OF CAPTAINS WHO WENT DOWN WITH THE SHIP
or lived, were fired, demoted or jailed.
Whatever you are driving, sailing or flying, you are responsible for.
I have found that this mindset influenced the high number of military air accidents I investigated.
Better to mitigate the damage on the ground and die a hero than admit an error or carelessness and get fired.
IMHO
The engineer of a train on auto pilot is still the engineer.
The driver of a car on autopilot is still the driver with the same responsibilities and expectations of any automobile operator.