How Safe Are Self-Driving Cars After the First Fatal Accident?

  • Thread starter: Dr. Courtney
  • Tags: Car, Self
Summary
A self-driving Uber vehicle struck and killed a pedestrian in Arizona, marking the first fatal incident involving autonomous cars. The event raises significant concerns about the safety and readiness of self-driving technology, especially given the limited number of vehicles in operation. Discussions highlight the potential for engineers to analyze the incident thoroughly, which could lead to improvements across all autonomous vehicles. There are debates about the legal implications of the accident, particularly regarding the accountability of the vehicle's operator and the technology itself. Ultimately, the incident underscores the complexities of integrating self-driving cars into public spaces and the necessity for rigorous safety standards.
  • #241
gleem said:
In 1921, the first year these statistics may have been available, there were 21 fatalities per 100 million vehicle miles, versus 1.18 fatalities per 100 million miles in 2016. Is it fair to compare AVs in the early stages of development and public familiarity with vehicles that have been in the public domain for almost a century?

Uber presently has a rate 2.4x higher than that.

gleem said:
One manufacturer was supposed to have had the policy that it was cheaper to settle lawsuits than to repair (recall) the cars affected.

Yes, that was Ford, and they were charged criminally for that, and I was the one who brought it up.
 
  • #242
As a pedestrian in Arizona you are more likely to be killed than in any other state as a percentage of the population: 1.61 fatalities per 100K vs a national average of 0.81, and 75% occur in the dark. The four most populous states, NY, CA, TX, and FL, have the most fatalities, but AZ, ranked 16th in population, comes next. Together these 5 states account for 43% of national pedestrian fatalities. PA, the fifth most populous state, has only 0.49 fatalities per 100K. The county with the most pedestrian fatalities in the US is Maricopa County, AZ, which is where Tempe is. Uber could have reduced their exposure to untoward incidents by not testing their cars in AZ.

Governors Highway Safety Report https://www.ghsa.org/sites/default/files/2018-02/pedestrians18.pdf
 
  • Like
Likes atyy
  • #243
gleem said:
Uber could have reduced their exposure to untoward incidents by not testing their cars in AZ.

And yet they chose not to.
 
  • #244
Of course, on the other hand, if they had carried out their tests in Hawaii this accident would have accounted for 50% of Hawaii's fatalities this year instead of just 0.9% in Arizona.
 
  • Like
Likes atyy
  • #245
berkeman said:
Sorry, I'm missing the distinction. What is the difference between self-driving and autopilot?
A self-driving car is a car that doesn't need a human driver. The Tesla Autopilot controls the speed and steering in some conditions but cannot handle all traffic situations, hence the need for the human to pay attention the whole time. Tesla cars tell you that clearly before you can use Autopilot. If you don't pay attention as the driver (!) it is your fault.
Vanadium 50 said:
Cars kill on average 12.5 people per billion miles driven. Uber self-driving cars kill on average 500 people per billion miles driven. You can reject the hypothesis that Uber cars are no less safe than human-driven cars at >95% confidence on the numbers we have.

For non-fatal accidents the rate is 6 per million miles driven. Waymo has 25 accidents in 5 million miles total - but if you look at the last three million miles, they have only had one: a rate that is 18x safer.

We can argue about small statistics, but the fact that in one case they are seeing a rate 40x higher and in the other 18x lower says something. Uber had a product that is, to the best of our knowledge, 720x more dangerous than Waymo's (and yes, 720 might be 500 or 1000), and because they wanted to beat Waymo in the market, they tested it on an unsuspecting populace, and sure enough, they killed someone.
Numbers! Thanks.
Uber had 2 million miles driven by the end of 2017. At the same number of miles Waymo had 24 non-fatal accidents, or 12 times the general rate. If they had the same elevated risk for fatal accidents (150 per billion miles), they had a ~30% chance of a fatal accident within these 2 million miles. Maybe they were just luckier. You know how problematic it is to draw statistical conclusions from a single event, or from the absence of one.
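As a quick sanity check on that ~30% figure (a minimal sketch, treating fatal accidents as a Poisson process, which is my simplification, and using the 150 per billion miles assumed above):

```python
from math import exp

assumed_fatal_rate = 150e-9   # fatal accidents per mile (150 per billion miles), the rate assumed above
miles = 2e6                   # the first 2 million miles

expected = assumed_fatal_rate * miles   # 0.3 expected fatal accidents
p_at_least_one = 1 - exp(-expected)     # Poisson probability of at least one event
print(f"expected fatal accidents: {expected:.2f}")
print(f"P(at least one): {p_at_least_one:.0%}")  # ~26%, in the same ballpark as the ~30% above
```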

I see two clear conclusions based on the numbers here:
* Waymo reduced its nonfatal accident rate over time, and it is now below the rate of human drivers
* The ratios "Uber fatal accident rate to human fatal accident rate in the first 2 million miles" and "Waymo nonfatal accident rate to human nonfatal accident rate in miles 2 million to 5 million" are significantly different. I'm not sure how much that comparison tells us.

Another thing to note: To demonstrate a lower rate of fatal accidents, Waymo will have to drive ~250 million miles, 50 times their current dataset, assuming no fatal accident.
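The ~250 million mile figure follows from the same Poisson logic (a sketch; I'm taking "demonstrate" to mean excluding the human rate at ~95% confidence with zero fatal accidents observed):

```python
from math import log

human_fatal_rate = 12.5e-9   # fatal accidents per mile, from earlier in the thread
confidence = 0.95

# With zero fatal accidents observed, the human-rate hypothesis is excluded at `confidence`
# once P(0 accidents | human rate) = exp(-rate * miles) falls below 1 - confidence.
required_expected = -log(1 - confidence)                # ~3 expected accidents
required_miles = required_expected / human_fatal_rate   # ~2.4e8 miles
print(f"miles needed with zero fatal accidents: {required_miles / 1e6:.0f} million")
```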

All this assumes the driving profiles for the cars and for humans are not too different. If the cars drive more or less frequently in the dark, more or less frequently on the highway (lower accident rates, but a larger share of the accidents are fatal), or similar, the numbers might change.
 
  • #246
Comparing apples to oranges.
According to Virginia Tech, the self-driving car crash rate is lower than the national average, per mile driven.
https://www.vtti.vt.edu/featured/?p=422
figure1.png

When compared to national crash rate estimates that control for unreported crashes (4.2 per million miles), the crash rates for the Self-Driving Car operating in autonomous mode when adjusted for crash severity (3.2 per million miles; Level 1 and Level 2 crashes) are lower. These findings reverse an initial assumption that the national crash rate (1.9 per million miles) would be lower than the Self-Driving Car crash rate in autonomous mode (8.7 per million miles) as they do not control for severity of crash or reporting requirements. Additionally, the observed crash rates in the SHRP 2 NDS, at all levels of severity, were higher than the Self-Driving Car rates. Estimated crash rates from SHRP 2 (age-adjusted) and Self-Driving Car are displayed in Figure 1.
 

Attachments

  • figure1.png (9.2 KB)
  • Like
Likes mfb
  • #247
These rates are for Google's cars aka Waymo. As rough guideline: Level 1 are serious crashes (involving airbags, injuries, require towing, ...). Level 2 accidents have property damage only, level 3 means no/minimal damage.
If we combine the first two we get about 6 per million miles for human drivers and 3 per million miles for Google (but only with 1.3 million miles driven - so just 4 accidents in total).
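To put a number on how soft "just 4 accidents in total" is, here is a rough exact Poisson confidence interval for that rate (a sketch using scipy; the 95% level is my choice):

```python
from scipy.stats import chi2

accidents = 4      # combined Level 1 + Level 2 crashes mentioned above
miles = 1.3e6      # autonomous miles in the comparison

# Exact (Garwood) 95% confidence interval for a Poisson count
lower = chi2.ppf(0.025, 2 * accidents) / 2
upper = chi2.ppf(0.975, 2 * (accidents + 1)) / 2

print(f"point estimate: {accidents / miles * 1e6:.1f} per million miles")
print(f"95% CI: {lower / miles * 1e6:.1f} to {upper / miles * 1e6:.1f} per million miles")
# roughly 0.8 to 7.9 per million miles - consistent with being either safer or worse than the ~6 for humans
```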
 
  • #248
mfb said:
These rates are for Google's cars aka Waymo. As rough guideline: Level 1 are serious crashes (involving airbags, injuries, require towing, ...). Level 2 accidents have property damage only, level 3 means no/minimal damage.
If we combine the first two we get about 6 per million miles for human drivers and 3 per million miles for Google (but only with 1.3 million miles driven - so just 4 accidents in total).

If those are Waymo only, there is a huge difference in the approach Waymo is taking and the approach Tesla is taking.

For example, Tesla has no Lidar, something that Waymo considers a necessity.
 
  • #249
Interesting details but I think we all know the root issue here is human decision making on all sides.

I don't know the level of misclassification of objects, but it must have been serious for the system to be disabled even for human-sized object detection at close range. Too many false positives, or emergency braking when the impact danger is minor compared to the possible reactions to the stop (a small animal in the road with a semi behind you), can also be dangerous, so this should have been a top-priority fix for Uber. Waymo seems to have designed systems that handle most cases.

This accident was caused by humans on all sides. The car detected the jaywalking, intoxicated human (as an unknown object) in the dark 6 seconds before impact and would have triggered emergency braking almost two seconds before hitting her. Humans made the decision to disable automatic braking instead of fixing the classification problem first, humans decided to put a single 'safety' driver in the car with an impossible job, and that human failed, as every study of human reaction time would predict, to monitor the need for emergency braking while not actively driving. The least responsible entity in the accident, IMO, is the car, its computers, and its software.
 
Last edited:
  • Like
Likes berkeman
  • #250
The car detected the object and didn't slow down. What exactly did the car expect the object to do? Magically disappear? Turn around and leave the street? While the latter is a likely outcome, it is not guaranteed, and every responsible driver would prepare to slow down as soon as they see the object, and slow down long before emergency braking is necessary.
NTL2009 said:
If those are Waymo only, there is a huge difference in the approach Waymo is taking and the approach Tesla is taking.

For example, Tesla has no Lidar, something that Waymo considers a necessity.
Tesla's approach is different in several aspects.
- No Lidar
- Build the hardware into a big number of cars
- Start with a driving assistant, but make it available to the public. Collect data from all the drivers using it.

Their software doesn't do as much as Waymo's at the moment, although we don't know how much it could do. But they have tons of traffic data, even for extremely rare traffic conditions - exactly the unknown unknowns, many of which Uber and Waymo have never encountered.
 
  • #251
mfb said:
The car detected the object and didn't slow down. What exactly did the car expect the object to do? Magically disappear? Turn around and leave the street?
Classifying the object early is the key. The computer needs some sort of generic physical description of size, mass, and direction to calculate the physics of motion for a possible future intercept path. The type of object is also important: bikes are very hard for a computer to judge in terms of movement and orientation, and they move unpredictably at slow speeds. Drivers don't always slow for every object (even people) at the edge of the road, or for bikes in the official bike lane a few feet from the side of a passing car.
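For illustration, the simplest version of that intercept calculation is straight constant-velocity extrapolation (a toy sketch; real perception stacks use much richer motion models, and all numbers below are made up):

```python
import numpy as np

def time_to_closest_approach(car_pos, car_vel, obj_pos, obj_vel):
    """Assume car and object both keep their current velocity; return the time of
    closest approach and the miss distance at that time."""
    rel_pos = np.array(obj_pos, float) - np.array(car_pos, float)
    rel_vel = np.array(obj_vel, float) - np.array(car_vel, float)
    speed_sq = rel_vel @ rel_vel
    t = 0.0 if speed_sq == 0 else max(0.0, -(rel_pos @ rel_vel) / speed_sq)
    miss = np.linalg.norm(rel_pos + rel_vel * t)
    return t, miss

# Hypothetical numbers: car doing ~19 m/s along x, object 110 m ahead crossing from the left at 1.4 m/s
t, miss = time_to_closest_approach(car_pos=(0, 0), car_vel=(19, 0),
                                   obj_pos=(110, 8), obj_vel=(0, -1.4))
print(f"closest approach in {t:.1f} s at {miss:.1f} m")  # a sub-metre miss distance: start slowing now
```

Getting the object's velocity wrong, which is what repeated reclassification does to you, is exactly what wrecks this kind of prediction.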

I would ask the same thing of the unfortunate jaywalking victim here about expectations. How can you not see an approaching car's headlights in darkness on a lonely street?
Human error, on all sides, contributes to 94 percent of traffic fatalities, according to U.S. regulators.
 
  • #252
Read the report please. The object was at no point anywhere close to anything that could have been a bike lane. As soon as it was on the street it was an object that doesn't belong there: A potential risk.
nsaspook said:
I would ask the same thing of the unfortunate jaywalking victim here about expectations. How can you not see an approaching car's headlights in darkness on a lonely street?
Well, it was clearly primarily the fault of the pedestrian - but it was still an accident the car could have avoided. The report gives some insight why the pedestrian might have missed the car:
The videos also show that the pedestrian, once visible, did not look in the direction of the vehicle until just before impact. [...] Toxicology test results for the pedestrian were positive for methamphetamine and marijuana.
 
  • Like
Likes nsaspook
  • #253
mfb said:
Read the report please. The object was at no point anywhere close to anything that could have been a bike lane. As soon as it was on the street it was an object that doesn't belong there: A potential risk.

The report lacks important details about the position at first detection, and the object's direction and speed from then until 1.3 seconds before impact.
Bicycle position at 4.7 seconds after first detection as an unknown object:
HWY18MH010-prelim-fig2.png

(This Uber self-driving system data playback from the fatal, March 18, 2018, crash of an Uber Technologies, Inc., test vehicle in Tempe, Arizona, shows when, at 1.3 seconds before impact, the system determined emergency braking was needed to mitigate a collision. The yellow bands depict meters ahead of the vehicle, the orange lines show the center of mapped travel lanes, the purple area shows the path of the vehicle and the green line depicts the center of that path.)

The report says the car's flawed systems still could have executed the correct procedure, an emergency braking sequence, once the potential risk became a high probability of actual impact, in time to at least mitigate the severity of the accident: the determination came at 1.3 seconds before impact, with the car ~25 meters from the victim at its current speed.
The report states that data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle, with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.

NTSB News Release
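As a rough kinematic check of the "mitigate the severity" point above: at 43 mph the car covers those ~25 meters in about 1.3 seconds, and if braking had started immediately (I'm assuming roughly 7 m/s² of deceleration, typical for hard braking on dry pavement, with no reaction delay), the impact speed would have been a small fraction of 43 mph:

```python
from math import sqrt

v0 = 43 * 0.44704   # 43 mph in m/s (~19.2 m/s), from the NTSB preliminary report
distance = 25.0     # approximate metres to the pedestrian at the 1.3 s mark
decel = 7.0         # assumed emergency deceleration in m/s^2 (dry pavement, no reaction delay)

stopping_distance = v0**2 / (2 * decel)          # ~26 m: a full stop is borderline
v_impact_sq = v0**2 - 2 * decel * distance
v_impact = sqrt(v_impact_sq) if v_impact_sq > 0 else 0.0
print(f"stopping distance: {stopping_distance:.1f} m")
print(f"impact speed with braking from 25 m out: {v_impact / 0.44704:.0f} mph")  # ~10 mph
```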
 

Attachments

  • HWY18MH010-prelim-fig2.png (61.2 KB)
  • #254
russ_watters said:
I can't fathom that someone would think it should be acceptable for Uber to be violating basic safe driving rules - and again, it disturbs me that I think I'm seeing people arguing that what Uber did is acceptable.

I agree.

They didn't have a safety driver. They had someone sitting in the driver's seat charged with other tasks. Furthermore, they used to have a second human operator for those tasks, but they went to one operator - and assigned her the non-safety tasks. That probably saved them a buck a mile. And now someone's dead.
 
  • Like
Likes russ_watters and nsaspook
  • #255
nsaspook said:
Bicycle position at 4.7 seconds after first detection as an unknown object.
Or 1.3 seconds after the first detection. An unknown object on the street that does not move notably in the direction of the traffic. I see the following options:
- it is nothing. I would expect this probability to be quite small. Slowing down will delay the car a bit but does no harm otherwise.
- it is a vehicle that broke down or stopped on the street (but not on the right side) for other reasons. This is a dangerous situation, a human might walk around, maybe even in a disoriented state if it was the result of an accident or similar. Slow down.
- it is some other stationary object like parts of a construction site. Slowing down might not be necessary, but it is not a bad idea either.
- it is a pedestrian or similar who has no business being on the street while a car approaches. Slow down.
- it is some slow object that does have a reason to be there (construction work or whatever). Slow down.
- something else? Is there any situation where slowing down could be a bad idea if a stationary or slow object is on the left side of the street?
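The common thread in all of these cases can be written down as an almost trivially conservative rule (a toy sketch of the policy I'm arguing for, not anything Uber or Waymo actually implements):

```python
def should_slow_down(in_or_near_travel_path: bool,
                     object_speed_mps: float,
                     classification_confident: bool) -> bool:
    """Conservative policy: any slow or stationary object near the travel path that the
    system cannot confidently explain away means 'reduce speed now'."""
    if not in_or_near_travel_path:
        return False
    if object_speed_mps < 2.0:            # stationary or walking pace
        return True
    return not classification_confident   # fast-moving but unidentified: still slow down

# The Tempe case: on the street, moving at walking pace, classification kept flip-flopping
print(should_slow_down(True, 1.4, False))   # True
```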
 
  • Like
Likes nsaspook
  • #256
https://www.reuters.com/article/us-...driving-car-crash-police-report-idUSKBN1JI0LB
SAN FRANCISCO/WASHINGTON (Reuters) - The safety driver behind the wheel of a self-driving Uber car in Tempe, Arizona, was streaming a television show on her phone until about the time of a fatal crash, according to a police report that deemed the March 18 incident “entirely avoidable.”

A report by the Tempe Police Department said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up a half second before the car hit Elaine Herzberg, 49, who was crossing the street at night.

The report said police concluded the crash, which has dealt Uber Technologies Inc a major setback in its efforts to develop self-driving cars, would have been “entirely avoidable” if Vasquez had been paying attention.

Vasquez could face charges of vehicular manslaughter, according to the report, which was released late on Thursday in response to a public records request.
According to a report last month by the National Transportation Safety Board, which is also investigating the crash, Vasquez told federal investigators she had been monitoring the self-driving interface in the car and that neither her personal nor business phones were in use until after the crash. That report showed Uber had disabled the emergency braking system in the Volvo, and Vasquez began braking less than a second after hitting Herzberg.

https://www.azcentral.com/story/new...crash-tempe-police-elaine-herzberg/724344002/
 
Last edited:
  • #258
nsaspook said:
https://www.theverge.com/2018/12/20...iving-car-return-public-road-pittsburgh-crash
Uber’s self-driving cars return to public roads for the first time since fatal crash
Let's hope the auto emergency braking function is working this time.
Perhaps PA could give the car a written driving test to see if it feels like following basic safety procedures now?

Still no manslaughter charges have been filed for the "driver" or car in the incident earlier this year, and presumably both still have their licenses.
 
  • Like
Likes nitsuj
  • #259
Self-driving car hits self-driving robot
Edit: This seems to have been a PR stunt, not an actual accident.
The robot wars have begun.

A robot got lost following other robots and ended up on a street where a Tesla pushed it to the side. The driver said he was aware of the robot but didn't brake - probably because no human was in danger. The robot fell over and got damaged.

While it is interesting to learn how the robot made it to the street: The car should have avoided the robot, and it will be interesting to see Tesla's reaction to it.
 
Last edited:
  • Like
Likes nsaspook
  • #260
mfb said:
Self-driving car hits self-driving robot
The robot wars have begun.

A robot got lost following other robots and ended up on a street where a Tesla pushed it to the side. The driver said he was aware of the robot but didn't brake - probably because no human was in danger. The robot fell over and got damaged.

While it is interesting to learn how the robot made it to the street: The car should have avoided the robot, and it will be interesting to see Tesla's reaction to it.

looks like a robot suicide.. lol
 
  • #261
russ_watters said:
Perhaps PA could give the car a written driving test to see if it feels like following basic safety procedures now?

Still no manslaughter charges have been filed for the "driver" or car in the incident earlier this year, and presumably both still have their licenses.

Do you think that manslaughter charges are due, and for whom?
 
  • #262
seazal said:
looks like a robot suicide.. lol
If you look at the screenshot, the robot is grinning a suspicious amount for a solo stroll on the road; I suspect it was under the influence and wandered.
 
  • #263
Say, what kind of sensors do these self-driving cars use for navigation? Ultrasonic? Heartbeat or exhaust sensors? Shape detection? I'd like to understand why it missed the robot.

What if you walk like a robot or wear a Darth Vader costume? Maybe when I see any self-driving cars in the street, I should just run as fast as I can to the side or look for cover.
 
  • #264
nitsuj said:
Do you think that manslaughter charges are due, and for whom?
Yes, certainly for the driver - vehicular homicide/manslaughter due to distracted driving is a fairly standard/common charge.

And if the prosecutors are feeling frisky, the supervisor(s) of the test as well. This one is a lot less likely, and less justified, now that we know the driver was watching a video at the time of the collision. Still, the driver was evidently explicitly instructed to violate safe driving principles/laws while safety features on the car were purposely disabled. That set of conditions was created by the test supervisor(s).
 
  • Like
Likes nitsuj and nsaspook
  • #265
seazal said:
Say, what kind of sensors do these self-driving cars use for navigation? Ultrasonic?
Google yields an awful lot of information on this if you search. The short answer is some combination of one or more of video, lidar, radar and ultrasonic.
https://www.sensorsmag.com/components/three-sensor-types-drive-autonomous-vehicles
 
  • #266
russ_watters said:
Google yields an awful lot of information on this if you search. The short answer is some combination of one or more of video, lidar, radar and ultrasonic.
https://www.sensorsmag.com/components/three-sensor-types-drive-autonomous-vehicles

I mean, why did it miss the walking robot? What sensors can miss it?
 
  • #268
russ_watters said:
Yes, certainly for the driver

Legally, who was the driver? If I were the person in the driver's seat, I'd be arguing "Hey, I wasn't the driver. My job was to sit up here and fill out forms."
 
  • #269
Vanadium 50 said:
Legally, who was the driver? If I were the person in the driver's seat, I'd be arguing "Hey, I wasn't the driver. My job was to sit up here and fill out forms."
Sure, but unless the law was rewritten (doubt it), the person with the steering wheel in front of them is a "driver" regardless of what the company tells them their job is.
 
  • #270
russ_watters said:
Yes, certainly for the driver - vehicular homicide/manslaughter due to distracted driving is a fairly standard/common charge.

And if the prosecutors are feeling frisky, the supervisor(s) of the test as well. This one is a lot less likely, and less justified, now that we know the driver was watching a video at the time of the collision. Still, the driver was evidently explicitly instructed to violate safe driving principles/laws while safety features on the car were purposely disabled. That set of conditions was created by the test supervisor(s).

I lean more towards the party that instructed the driver to violate safe driving principles, in this case Uber, I guess. Presumably the driver assumed that was because the car can drive itself and not hit things on the road, which turned out not to be the case.

I guess the question is whether a reasonable person would follow those instructions.

The test setup doesn't seem to have been done with malicious intent, so I see it as "being done by Uber"; the supervisor is just an "agent" of Uber. Same as if I have authority to bind my employer to a contract: my employer cannot back out because they don't like it.
 
