Well at least no one died this time
Per mile driven, all of these autonomous systems are statistically better than humans at driving.
It’s mostly because humans are dogshit at driving, but you are a lot safer using these systems than not, despite what the media coverage suggests.
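To make the “per mile driven” point concrete, here’s a rough back-of-envelope sketch of how that normalization works. Every number in it is a made-up placeholder, not a real crash statistic:

```python
# Back-of-envelope sketch of "incidents per mile driven".
# Every number here is a made-up placeholder, not a real crash statistic.

def incidents_per_million_miles(incidents: int, miles_driven: float) -> float:
    """Normalize a raw incident count by exposure (miles driven)."""
    return incidents / (miles_driven / 1_000_000)

# Hypothetical fleets: a small AV fleet vs. human drivers in the same area.
av_rate = incidents_per_million_miles(incidents=30, miles_driven=10_000_000)
human_rate = incidents_per_million_miles(incidents=5_000, miles_driven=1_000_000_000)

print(f"AV fleet: {av_rate:.1f} incidents per million miles")
print(f"Humans:   {human_rate:.1f} incidents per million miles")
# Humans log far more raw incidents AND far more miles; only the normalized
# rate tells you which group is actually safer per mile driven.
```

The point is just that raw incident counts mean nothing until you divide by exposure.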
That’s only because they don’t drive that fast, and they just dump most problems on a human driver. If a self-driving car stopped in the middle of an intersection and a human driver hit it, the blame would legally fall on the human even though the self-driving car caused the issue.
Liability for an accident doesn’t factor into those statistics. They include all accidents regardless of blame.
And you act like that exact scenario hasn’t happened with human drivers. In Arizona alone, 3.5 people die every day in traffic incidents. Given the number of dumb fuck road ragers brake checking other drivers, stopping in the middle of an intersection and getting hit probably happens at least once a day, if not more.
Yet even with all those fatalities, and other accidents in general, the autonomous systems still have fewer incidents per mile driven.
These cars were banned from CA roads because they had too many safety issues. That’s why they’re in AZ. One company didn’t address the issues, so the CA DMV pulled its permits entirely.
https://www.npr.org/2023/10/24/1208287502/california-orders-cruise-driverless-cars-off-the-roads-because-of-safety-concern
Why else do you think Google is testing this thousands of miles from their headquarters?
That’s an apples to oranges comparison. Self-driving cars aren’t driving on the same roads and in the same conditions. Maybe they’re better, but that hasn’t really been tested or evaluated.
Sir, your whole family died in a Waymo accident.
First of all, statistically, the odds were higher that they didn’t die. And now leave the billionaires alone.
3.5 people died on Arizona roads yesterday and 3.5 more will die today. Think the robots will be safer than us someday?
They already are; the media just reports on every one of these crashes. Even reporting each human fatality daily would put things in perspective, even with every autonomous accident being covered as if it were the end times.
Well, considering 3.5 people die and way more than that crash in Arizona every day… I’d say yes. It sounds like they already got there about a year ago.