How Safe Are Self-Driving Cars After the First Fatal Accident?

  • Thread starter: Dr. Courtney
  • Tags: Car, Self
AI Thread Summary
A self-driving Uber vehicle struck and killed a pedestrian in Arizona, the first pedestrian fatality involving an autonomous car. The event raises significant concerns about the safety and readiness of self-driving technology, especially given the limited number of such vehicles in operation. Discussions highlight the potential for engineers to analyze the incident thoroughly, which could lead to improvements across all autonomous vehicles. There are debates about the legal implications of the accident, particularly regarding the accountability of the vehicle's operator and of the technology itself. Ultimately, the incident underscores the complexities of integrating self-driving cars into public spaces and the necessity of rigorous safety standards.
  • #251
mfb said:
The car detected the object and didn't slow down. What exactly did the car expect the object to do? Magically disappear? Turn around and leave the street?
Early classification of the object is the key. The computer needs some generic physical description of size, mass, and direction to calculate the physics of motion for a possible future intercept path. The type of object is also important: bikes are very hard for computers to judge, orientation-wise, and they move unpredictably at slow speeds. Drivers don't always slow for every object (even people) at the edge of the road, or for bikes in the official bike lane a few feet from the side of a passing car.
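To make the "physics of motion" point concrete: below is a minimal constant-velocity intercept check of the kind a planner needs before it can decide whether an object ahead matters. This is a toy sketch, not Uber's software; the function, names, and numbers are all illustrative.

Code:
def time_to_intercept(rel_pos, rel_vel):
    """Toy intercept check for a tracked object, constant-velocity model.

    rel_pos: (x, y) object position relative to the car, in meters
             (x = forward along the lane, y = lateral offset, 0 = lane center)
    rel_vel: (vx, vy) object velocity relative to the car, in m/s
    Returns (seconds until the object reaches the lane center,
             forward gap in meters at that moment), or None.
    A real tracker would carry class-dependent uncertainty on top of this;
    as noted above, bikes and pedestrians move unpredictably at low speed.
    """
    x, y = rel_pos
    vx, vy = rel_vel
    if abs(vy) < 1e-6:
        return None              # no lateral motion: no crossing predicted
    t_cross = -y / vy            # time until the object reaches lane center
    if t_cross < 0:
        return None              # moving away from our path
    gap = x + vx * t_cross       # forward distance between us at that time
    return t_cross, gap

# Pedestrian 40 m ahead and 4 m left of lane center, walking right at
# 1.4 m/s, while the car closes at 19 m/s (~43 mph):
print(time_to_intercept((40.0, -4.0), (-19.0, 1.4)))
# -> (~2.9 s, ~-14 m): the gap goes negative before she clears the lane,
#    i.e. car and pedestrian occupy the same stretch of road. Brake.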

I would ask the same thing of the unfortunate jaywalking victim here about expectations. How can you not see an approaching car's headlights in darkness on a lonely street?
Human error, on all sides, contributes to 94 percent of traffic fatalities, according to U.S. regulators.
 
  • #252
Please read the report. The object was at no point anywhere close to anything that could have been a bike lane. As soon as it was on the street, it was an object that didn't belong there: a potential risk.
nsaspook said:
I would ask the same thing of the unfortunate jaywalking victim here about expectations. How can you not see an approaching car's headlights in darkness on a lonely street?
Well, it was clearly primarily the fault of the pedestrian - but it was still an accident the car could have avoided. The report gives some insight why the pedestrian might have missed the car:
The videos also show that the pedestrian, once visible, did not look in the direction of the vehicle until just before impact. [...] Toxicology test results for the pedestrian were positive for methamphetamine and marijuana.
 
  • Like
Likes nsaspook
  • #253
mfb said:
Please read the report. The object was at no point anywhere close to anything that could have been a bike lane. As soon as it was on the street, it was an object that didn't belong there: a potential risk.

The report lacks important details about the object's position, direction, and speed from first detection until 1.3 seconds before impact.
Bicycle position at 4.7 seconds after first detection as an unknown object:
[Figure: HWY18MH010-prelim-fig2.png]

(This Uber self-driving system data playback from the fatal, March 18, 2018, crash of an Uber Technologies, Inc., test vehicle in Tempe, Arizona, shows when, at 1.3 seconds before impact, the system determined emergency braking was needed to mitigate a collision. The yellow bands depict meters ahead of the vehicle, the orange lines show the center of mapped travel lanes, the purple area shows the path of the vehicle and the green line depicts the center of that path.)

The report says the car's flawed system could still have executed the correct procedure, an emergency braking sequence, once the potential risk became a high probability of actual impact, at 1.3 seconds and ~25 meters from the victim at the car's speed, in time to at least mitigate the severity of the accident.
The report states that data obtained from the self-driving system shows the system first registered radar and LIDAR observations of the pedestrian about six seconds before impact, when the vehicle was traveling 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle, with varying expectations of her future travel path. At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed to mitigate a collision. According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
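As a rough sanity check on those numbers, here is a sketch assuming hard braking at about 7 m/s² on dry asphalt and zero actuator delay (textbook values I've assumed, not figures from the report):

Code:
import math

V0 = 43 * 0.44704      # 43 mph in m/s (~19.2 m/s)
A = 7.0                # assumed full-braking deceleration, m/s^2
D = 25.0               # distance to the pedestrian when braking was needed, m

v_sq = V0**2 - 2 * A * D      # kinematics: v^2 = v0^2 - 2*a*d
if v_sq <= 0:
    print(f"Stops in {V0**2 / (2 * A):.1f} m: impact avoided")
else:
    v = math.sqrt(v_sq)
    print(f"Impact at {v:.1f} m/s ({v / 0.44704:.0f} mph)")
# -> impact at roughly 4.4 m/s (~10 mph). Not avoided, but a huge
#    reduction in severity compared with hitting her at 43 mph.

So even the disabled last-second braking would very likely have mattered.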

NTSB News Release
 

  • #254
russ_watters said:
I can't fathom that someone would think it should be acceptable for Uber to be violating basic safe driving rules - and again, it disturbs me that I think I'm seeing people arguing that what Uber did is acceptable.

I agree.

They didn't have a safety driver. They had someone sitting in the driver's seat charged with other tasks. Furthermore, they used to have a second human operator for those tasks, but they went to one operator - and assigned her the non-safety tasks. That probably saved them a buck a mile. And now someone's dead.
 
  • Like
Likes russ_watters and nsaspook
  • #255
nsaspook said:
Bicycle position at 4.7 seconds after first detection as an unknown object.
Or 1.3 seconds after the first detection. An unknown object on the street that does not move notably in the direction of the traffic. I see the following options:
- it is nothing. I would expect this probability to be quite small. Slowing down will delay the car a bit but does not harm otherwise.
- it is a vehicle that broke down or stopped on the street (but not on the right side) for other reasons. This is a dangerous situation, a human might walk around, maybe even in a disoriented state if it was the result of an accident or similar. Slow down.
- it is some other stationary object like parts of a construction site. Slowing down might not be necessary, but it is not a bad idea either.
- it is a pedestrian or similar who has no business being on the street while a car approaches. Slow down.
- it is some slow object that does have a reason to be there (construction work or whatever). Slow down.
- something else? Is there any situation where slowing down could be a bad idea if a stationary or slow object is on the left side of the street?
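Note that nearly every branch above ends in the same action, which is the point: you don't need a correct classification to know you should slow down. A toy version of that policy (illustrative classes and thresholds, nothing from Uber's stack):

Code:
def plan_speed(obj_class, closing_laterally, distance_m):
    """Toy planner reaction to a stationary/slow object on or near the road.

    Whatever the classifier says (unknown object, stalled vehicle,
    debris, pedestrian, construction), the conservative action is the
    same. The 30 m threshold is illustrative only.
    """
    if obj_class == "clear_road":
        return "maintain"
    if distance_m < 30:          # too close to resolve the ambiguity safely
        return "brake"
    if closing_laterally:        # drifting toward our path
        return "slow_down"
    return "slow_down"           # ambiguous but on the street: slow down

for c in ("unknown", "vehicle", "bicycle", "pedestrian"):
    print(c, "->", plan_speed(c, True, 80.0))
# Every classification maps to slow_down: the shifting labels alone
# can't explain why the car never slowed.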
 
  • Like
Likes nsaspook
  • #256
https://www.reuters.com/article/us-...driving-car-crash-police-report-idUSKBN1JI0LB
SAN FRANCISCO/WASHINGTON (Reuters) - The safety driver behind the wheel of a self-driving Uber car in Tempe, Arizona, was streaming a television show on her phone until about the time of a fatal crash, according to a police report that deemed the March 18 incident “entirely avoidable.”

A report by the Tempe Police Department said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up a half second before the car hit Elaine Herzberg, 49, who was crossing the street at night.

The report said police concluded the crash, which has dealt Uber Technologies Inc a major setback in its efforts to develop self-driving cars, would have been “entirely avoidable” if Vasquez had been paying attention.

Vasquez could face charges of vehicular manslaughter, according to the report, which was released late on Thursday in response to a public records request.
According to a report last month by the National Transportation Safety Board, which is also investigating the crash, Vasquez told federal investigators she had been monitoring the self-driving interface in the car and that neither her personal nor business phones were in use until after the crash. That report showed Uber had disabled the emergency braking system in the Volvo, and Vasquez began braking less than a second after hitting Herzberg.

https://www.azcentral.com/story/new...crash-tempe-police-elaine-herzberg/724344002/
 
  • #258
nsaspook said:
https://www.theverge.com/2018/12/20...iving-car-return-public-road-pittsburgh-crash
Uber’s self-driving cars return to public roads for the first time since fatal crash
Let's hope the auto emergency braking function is working this time.
Perhaps PA could give the car a written driving test to see if it feels like following basic safety procedures now?

Still, no manslaughter charges have been filed against the "driver" or the car in the incident earlier this year, and presumably both still have their licenses.
 
  • Like
Likes nitsuj
  • #259
Self-driving car hits self-driving robot
Edit: This seems to have been a PR stunt, not an actual accident.
The robot wars have begun.

A robot got lost following other robots and ended up on a street where a Tesla pushed it to the side. The driver said he was aware of the robot but didn't brake - probably because no human was in danger. The robot fell over and got damaged.

While it is interesting to learn how the robot made it to the street: The car should have avoided the robot, and it will be interesting to see Tesla's reaction to it.
 
  • Like
Likes nsaspook
  • #260
mfb said:
Self-driving car hits self-driving robot
The robot wars have begun.

A robot got lost following other robots and ended up on a street where a Tesla pushed it to the side. The driver said he was aware of the robot but didn't brake - probably because no human was in danger. The robot fell over and got damaged.

While it is interesting to learn how the robot made it to the street: The car should have avoided the robot, and it will be interesting to see Tesla's reaction to it.

looks like a robot suicide.. lol
 
  • #261
russ_watters said:
Perhaps PA could give the car a written driving test to see if it feels like following basic safety procedures now?

Still, no manslaughter charges have been filed against the "driver" or the car in the incident earlier this year, and presumably both still have their licenses.

Do you think that manslaughter charges are due, and for whom?
 
  • #262
seazal said:
looks like a robot suicide.. lol
If you look at the screenshot, the robot is grinning a suspicious amount for a solo stroll on the road; I suspect it was under the influence and wandered.
 
  • #263
Say, what kind of sensors do these self-driving cars use for navigation? Is it ultrasonic? Heartbeat or exhaust sensors? Shape detection? I'd like to understand why it missed the robot.

What if you walk like a robot or wear a Darth Vader costume? Maybe when I see a self-driving car in the street, I should just run to the side as fast as I can or look for cover.
 
  • #264
nitsuj said:
Do you think that manslaughter charges are due, and for whom?
Yes, certainly for the driver - vehicular homicide/manslaughter due to distracted driving is a fairly standard/common charge.

And if the prosecutors are feeling frisky, the supervisor(s) of the test as well. This one is a lot less likely, and less justified, now that we know the driver was watching a video at the time of the collision. Still, the driver was evidently explicitly instructed to violate safe driving principles/laws while safety features on the car were purposely disabled. That set of conditions was created by the test supervisor(s).
 
  • Like
Likes nitsuj and nsaspook
  • #265
seazal said:
Say, what kind of sensors do these self-driving cars use for navigation? Is it ultrasonic?
Google yields an awful lot of information on this if you search. The short answer is some combination of one or more of video, lidar, radar and ultrasonic.
https://www.sensorsmag.com/components/three-sensor-types-drive-autonomous-vehicles
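The reason they're combined is that each sensor fails differently: cameras in low light, radar against small or non-metallic targets, ultrasonics beyond a few meters. A crude vote-based fusion sketch (hypothetical flags, not any vendor's API) shows why a fused stack should still have seen the robot:

Code:
def fused_detection(camera, lidar, radar, ultrasonic):
    """Naive majority-style fusion of independent detection flags.

    Requiring agreement between two sensors (an illustrative rule)
    rejects single-sensor noise while covering each sensor's blind
    spots -- but it only helps if the planner acts on the fused result.
    """
    return sum([camera, lidar, radar, ultrasonic]) >= 2

# A small plastic robot at night: the camera misses it (dark) and radar
# returns weakly (poor reflector), but lidar sees its shape and an
# ultrasonic sensor pings it at close range.
print(fused_detection(camera=False, lidar=True, radar=False, ultrasonic=True))
# -> True: the fused system detects it and should brake or steer around.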
 
  • #266
russ_watters said:
Google yields an awful lot of information on this if you search. The short answer is some combination of one or more of video, lidar, radar and ultrasonic.
https://www.sensorsmag.com/components/three-sensor-types-drive-autonomous-vehicles

I mean, why did it miss the walking robot? What sensors could miss it?
 
  • #268
russ_watters said:
Yes, certainly for the driver

Legally, who was the driver? If I were the person in the driver's seat, I'd be arguing "Hey, I wasn't the driver. My job was to sit up here and fill out forms."
 
  • #269
Vanadium 50 said:
Legally, who was the driver? If I were the person in the driver's seat, I'd be arguing "Hey, I wasn't the driver. My job was to sit up here and fill out forms."
Sure, but unless the law was rewritten (doubt it), the person with the steering wheel in front of them is a "driver" regardless of what the company tells them their job is.
 
  • #270
russ_watters said:
Yes, certainly for the driver - vehicular homicide/manslaughter due to distracted driving is a fairly standard/common charge.

And if the prosecutors are feeling frisky, the supervisor(s) of the test as well. This one is a lot less likely, and less justified, now that we know the driver was watching a video at the time of the collision. Still, the driver was evidently explicitly instructed to violate safe driving principles/laws while safety features on the car were purposely disabled. That set of conditions was created by the test supervisor(s).

I lean more towards the person that instructed the driver to violate safe driving principles, in this case Uber, I guess. Presumably the driver assumed that was because the car can drive itself and not hit things on the road, which turned out not to be the case.

I guess the question is whether a reasonable person would follow those instructions.

The test setup doesn't seem to have been done with malicious intent, so I see it as being done by Uber; the supervisor is just an "agent" of Uber. Same as if I have authority to bind my employer to a contract: my employer cannot back out because they don't like it.
 
  • #271
I think that people are forgetting the first few pages, where it seems pretty clear that the pedestrian stepped into the path of the car at night, nowhere near a crosswalk and at the last second, in the darkest section of the road - such that it would have been difficult to avoid even if the driver had been paying full attention. If this had happened to me, I have a dash cam that would have shown it as unavoidable, just as the forward-looking camera did on the Uber car. But because there was an internal camera pointed at the driver showing that she wasn't paying attention, suddenly she's guilty of manslaughter? Yes, she should have been paying attention, but I don't think that it would have made any difference, because it's doubtful that she could have reacted fast enough anyway.
OmCheeto said:
About an hour ago, the Tempe police department released a video of the accident:


Correction on the last-second part. Based on Om's video link, she was crossing from the left but was in the section between the lights. I still think that this is far more the fault of the pedestrian, and that it would be a coin flip as to whether it was avoidable or not.
 
  • #272
nitsuj said:
Presumably the driver assumed that was because the car can drive itself and not hit things on the road, which turned out not to be the case.
I would hope the driver was instructed that the collision avoidance system had been turned off, but if not that would shift culpability to Uber management.
 
  • #273
Borg said:
I think that people are forgetting the first few pages, where it seems pretty clear that the pedestrian stepped into the path of the car at night, nowhere near a crosswalk and at the last second, in the darkest section of the road - such that it would have been difficult to avoid even if the driver had been paying full attention. If this had happened to me, I have a dash cam that would have shown it as unavoidable, just as the forward-looking camera did on the Uber car. But because there was an internal camera pointed at the driver showing that she wasn't paying attention, suddenly she's guilty of manslaughter?
This thread is getting old and stale, but we've discussed before that, according to the preliminary NTSB report, the car detected and tracked the pedestrian for six seconds before impact, meaning the pedestrian was out in the open and the driver would have had an unobstructed view for about 330 feet, a football field. The lighting is harder to pin down: a number of people shot stills and video to debunk the idea that the road was very dark, though that is tough to settle because of varying camera settings. But sure, if it can be demonstrated - and it should be easy for police or the NTSB to do so - that darkness precluded seeing the pedestrian until too late, that would reduce the driver's culpability.
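Back-of-envelope, with an assumed 1.5 s perception-reaction time and 7 m/s² of hard braking (typical textbook values, not figures from the NTSB report):

Code:
V0 = 43 * 0.44704          # 43 mph in m/s
REACTION_S = 1.5           # assumed perception-reaction time, s
DECEL = 7.0                # assumed hard braking, m/s^2
SIGHT_M = 330 * 0.3048     # ~330 ft of unobstructed view, in meters

stop_m = V0 * REACTION_S + V0**2 / (2 * DECEL)
print(f"needs ~{stop_m:.0f} m to stop, had ~{SIGHT_M:.0f} m of sight line")
# -> needs ~55 m, had ~100 m: an attentive driver at 43 mph had roughly
#    twice the distance required, consistent with the police finding
#    that the crash was "entirely avoidable".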

I'm not sure if the pedestrian's jaywalking matters legally or not. There are a number of sites that discuss the civil side (yes, a jaywalker can successfully sue a driver who hits them):
https://www.injuryattorneyofdallas.com/jaywalker-hit-car-whos-fault/
 
  • #274
russ_watters said:
Sure, but unless the law was rewritten (doubt it), the person with the steering wheel in front of them is a "driver" regardless of what the company tells them their job is.

I looked it up. The legal term is "operator" and the legal definition is: "Operator" means a person who drives a motor vehicle on a highway, who is in actual physical control of a motor vehicle on a highway, or who is exercising control over or steering a vehicle being towed by a motor vehicle.

I think the phrase "actual physical control" is subject to lawyering, and I think Uber should be terrified of the question. If the answer is "the check driver is in actual physical control", then they need to explain why they added job tasks that would require her attention to be elsewhere. This makes them negligent. If the answer is "no, the check driver is not in physical control", who or what is? And the only answer to that is "the car's programming", which is again, Uber's responsibility.
 
  • Like
Likes nitsuj, Bystander, mfb and 2 others
  • #275
As a PS, I expect a lot of legal wrangling over the difference between "physical control" and just plain old "control".
 
  • #276
Vanadium 50 said:
As a PS, I expect a lot of legal wrangling over the difference between "physical control" and just plain old "control".

Wouldn't the law certainly say the person in the driver's seat was the driver? What's ambiguous is whether the employee, the employer, or the person who set up the test with anti-collision disabled is to blame.
 
  • #277
Borg said:
Yes, she should have been paying attention, but I don't think that it would have made any difference, because it's doubtful that she could have reacted fast enough anyway.
How can something that didn't happen be used as justification for something that did?

Peeps can be charged for intent alone. I see the logic of your comment as the same as saying "Yes, they should not have been conspiring to commit a terrorist attack, but they didn't succeed, so it doesn't matter anyway."
 
  • #278
nitsuj said:
Wouldn't the law certainly say the person in the driver's seat was the driver? What's ambiguous is whether the employee, the employer, or the person who set up the test with anti-collision disabled is to blame.
Where is the driver's seat, according to the law (there are cars that have it on the other side), and who would have thought to define that explicitly at a time when driverless cars didn't exist?
nitsuj said:
How can something that didn't happen be used as justification for something that did?
"Failing to do X" is typically not illegal if doing X wouldn't have made a difference. If you don't give CPR to a person who is clearly dead it is okay - giving CPR wouldn't make a difference.
 
  • #279
nitsuj said:
How can something that didn't happen be used as justification for something that did?

Peeps can be charged for intent alone. I see the logic of your comment as the same as saying "Yes, they should not have been conspiring to commit a terrorist attack, but they didn't succeed, so it doesn't matter anyway."

I think the slight difference here is that the person hit was breaking a traffic law, jaywalking at a location where the city had explicitly put up a "Do Not Cross" sign. I think that's usually a mitigating factor in the blame calculation, even if the driver was distracted but otherwise a reasonable (not impaired) driver. An offsetting-'foul' type of situation for a criminal charge, or for civil liability, due to contributory or comparative negligence.

It's not an excuse for Uber disconnecting the emergency auto-braking system because of operational 'false alarm' problems.
 
  • Like
Likes nitsuj
  • #280
mfb said:
Where is the driver's seat, according to the law (there are cars that have it on the other side), and who would have thought to define that explicitly at a time when driverless cars didn't exist?
"Failing to do X" is typically not illegal if doing X wouldn't have made a difference. If you don't give CPR to a person who is clearly dead, it is okay - giving CPR wouldn't make a difference.

It doesn't have to be in the law for us to discuss how the law would be interpreted. I don't find it difficult to call the person in the driver's seat the driver, regardless of what technology was assisting the person in driving; driving is a conscious task in sum. To your point, the law wasn't written for driverless cars. I find it silly to debate who was the driver in this situation.

The person was breaking the law, and the only reason it's against the law is the risk of injury and damage. It's okay for me not to give CPR if I don't know how to give CPR.

Ignoring a duty of care is against the law. Not paying attention while driving a car is ignoring a duty of care, because the activity can result in injury or damage.

I'm starting to think "Uber assumed the duty of care in operating the car" is a good way to word it, if they did actually give instructions that more or less mean the same.
 
  • #281
nitsuj said:
I find it silly to debate who was the driver in this situation.
The courts don't find it silly exactly because the laws can be ambiguous - they were not written with such a case in mind.
 
  • #282
nitsuj said:
Wouldn't the law certainly say the person in the driver's seat was the driver?

I posted exactly what the law said in message #274.
 
  • #283
Vanadium 50 said:
I posted exactly what the law said in message #274.

I don't see that as unclear in the context of this case. The lone person in the car, in the driver's seat, is the operator of the vehicle, regardless of what tech was engaged, by the operator, to assist in operating the vehicle.
mfb said:
The courts don't find it silly exactly because the laws can be ambiguous - they were not written with such a case in mind.

Laws aren't written for an infinite number of variables. IMO the driver was in control of the car. I appreciate that you see it differently.
 
  • #284
Vanadium 50 said:
I looked it up. The legal term is "operator" and the legal definition is: "Operator" means a person who drives a motor vehicle on a highway, who is in actual physical control of a motor vehicle on a highway, or who is exercising control over or steering a vehicle being towed by a motor vehicle.

I think the phrase "actual physical control" is subject to lawyering,
"Actual physical control" may be subject to lawyering, but is "person?" Unless you want to argue that the car's programming is a person, either the check driver was in control or no one was.
 
  • #285
TeethWhitener said:
"Actual physical control" may be subject to lawyering, but is "person?" Unless you want to argue that the car's programming is a person, either the check driver was in control or no one was.
Not the programming, the programmer.

It's a tough call, though, whether to consider a disabled safety feature an "operator" error or a product defect.
 
  • Like
Likes Bystander and nsaspook
  • #286
russ_watters said:
Not the programming, the programmer.

What are the limits of "Actual physical control" by a person?

As was discussed earlier in this thread, would the programmer(s) be negligent if the programming resulted in driving at the level of a reasonably prudent person (the human standard under current law)? And if beyond-human driving capabilities like LIDAR, night vision, or even emergency auto-braking fail to prevent a fatal 'accident', what would the liability be when a reasonably prudent human driver would also have failed without those advanced capabilities?
 
  • Like
Likes russ_watters
  • #287
mfb said:
Self-driving car hits self-driving robot
The robot wars have begun.

A robot got lost following other robots and ended up on a street where a Tesla pushed it to the side. The driver said he was aware of the robot but didn't brake - probably because no human was in danger. The robot fell over and got damaged.

While it is interesting to learn how the robot made it to the street: The car should have avoided the robot, and it will be interesting to see Tesla's reaction to it.
Apparently this was a staged crash that the media fell for.
 
  • Like
Likes russ_watters and 256bits
  • #288
One can actually see a rope on the robot's arm.
 
  • Like
Likes nsaspook
  • #289
256bits said:
One can actually see a rope on the robot's arm.
[Image: Boris_natasha_fearless.jpg]
 

  • #290
nsaspook said:
What are the limits of "Actual physical control" by a person?

As was discussed earlier in this thread, would the programmer(s) be negligent if the programming resulted in driving at the level of a reasonably prudent person (the human standard under current law)? And if beyond-human driving capabilities like LIDAR, night vision, or even emergency auto-braking fail to prevent a fatal 'accident', what would the liability be when a reasonably prudent human driver would also have failed without those advanced capabilities?
Yes, this is a key open question. The collision avoidance and other automation features that are becoming widespread carry disclaimers in the owner's manuals that tell the driver they are responsible and not to rely on those features. It does logically make sense because those features should only kick in after the human has failed to act when they should. But that's just what their lawyers tell them to write. I don't know if they've been tested in litigation.

But in this accident, the level of automation is higher (and thus the level of ongoing control by the driver lower), but a safety feature was purposely disabled by the manufacturer, which is a specific decision by a person.

One thing that is cool, but a double-edged sword, about Teslas is that their connectedness allows them to be updated without the user's knowledge. This creates a risk that a software flaw could be installed overnight in a million cars, suddenly making them all unsafe or non-functional.
 
  • Like
Likes nitsuj and CWatters
  • #291
CWatters said:
Apparently this was a staged crash that the media fell for.
Electrek wrote an article about it: A robot company stages Tesla crash as a PR stunt, media buys it

While the rest of the article is good, I'm not sure how good their argument about the passenger's statement is. Yes, the feature has a different name, but if people could correctly describe which software feature they used, we could get rid of half of the IT support staff.
 
  • #293
https://www.forbes.com/sites/bradte...cy-for-fatal-uber-robocar-crash/#316b8624c6d2
NTSB Hearing Blames Humans, Software And Policy For Fatal Uber Robocar Crash - But Mostly Humans

Human errors

When it comes to human fault, the report noted that Herzberg had a “high concentration of methamphetamine” (more than 10 times the medicinal dose) in her blood which would alter her perception. She also had some marijuana residue. She did not look to her right at the oncoming vehicle until 1 second before the crash.

There was also confirmation that the safety driver had indeed pulled out a cell phone and was streaming a TV show on it, looking down at it 34% of the time during her driving session, with a full 5 second “glance” from 6 to 1 seconds prior to the impact.
...
Normally, a pedestrian crossing a high speed street outside a crosswalk would exercise some minimal caution, starting with “look both ways before crossing the street” as we are all taught as children. By all appearances, the crash took place late on a Sunday night on a largely empty road, exactly the sort of situation where a person would normally hear any approaching car well in advance, and check regularly to the right for oncoming traffic, which would be very obvious because of its headlights – obvious even in peripheral vision. Herzberg crossed obliviously, looking over just one second before impact. NTSB investigators attributed this to the meth in her system. They did not know if the concentration in her blood was going up (due to recently taken doses) and altering perception, or coming down (causing unusual moods.)
 
  • Informative
Likes berkeman
  • #294
As an update, the driver has been charged with negligent homicide.

If convicted, this will be the driver's second felony conviction.
 
  • Like
Likes berkeman and russ_watters
