How Safe Are Self-Driving Cars After the First Fatal Accident?

  • Thread starter: Dr. Courtney
  • Tags: Car, Self
Summary:
A self-driving Uber vehicle struck and killed a pedestrian in Arizona, marking the first fatal incident involving autonomous cars. The event raises significant concerns about the safety and readiness of self-driving technology, especially given the limited number of vehicles in operation. Discussions highlight the potential for engineers to analyze the incident thoroughly, which could lead to improvements across all autonomous vehicles. There are debates about the legal implications of the accident, particularly regarding the accountability of the vehicle's operator and the technology itself. Ultimately, the incident underscores the complexities of integrating self-driving cars into public spaces and the necessity for rigorous safety standards.
  • #91
nsaspook said:
They look bad, like most SUV headlight systems do on a dark object at night.

Here in the UK, IIRC, the regulation for dipped headlights is that they must illuminate a distance of 50 m (~165 feet) in clear weather conditions. I imagine the US has similar laws, but the video and your statement seem to suggest otherwise. Looking back at the video, the headlights are only illuminating two road markings ahead of the vehicle, which isn't even 10 m.

I think there are a couple of serious questions to be answered here.

1) Given that Elaine had reached the far lane while crossing at a slow, steady pace, and that the vehicle had not turned a corner or passed any other blind spot, was she picked up by the LIDAR? If so, at what distance, and what corrective measures (if any) did the car attempt? If she was not, why not?

2) Again, given that Elaine didn't jump out suddenly and the car had not turned a corner, why did the driver not notice her? What measures have Uber taken to ensure their drivers understand the need to be constantly aware of the situation?

Given the video, I suspect that if the car had been driven by a human with full beams on (given that it's a dark road with no oncoming traffic), they would have seen Elaine at a distance sufficient to slow down, and Elaine would have had a visual warning of the car from a greater distance. If the car doesn't need the headlights to drive itself, then there seems to be a double failure here: 1) whatever caused the car not to see her at a sufficient distance, and 2) the driver/car failing to put on high beams, possibly due to complacency with respect to the autopilot's capability.
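As a rough sanity check on the "distance sufficient to slow down" point, here is a back-of-envelope sketch in Python. The ~1.5 s reaction time and ~7 m/s² braking deceleration are generic textbook assumptions, not values measured from this incident, and the 40 MPH figure is the approximate speed discussed later in the thread:

```python
# Back-of-envelope stopping distance: reaction distance + braking distance.
# Assumed values (not from any incident report): 1.5 s reaction, 7 m/s^2 braking.
MPH_TO_MS = 0.44704

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_ms2=7.0):
    v = speed_mph * MPH_TO_MS           # speed in m/s
    reaction = v * reaction_s           # distance covered before braking starts
    braking = v ** 2 / (2 * decel_ms2)  # distance covered while braking: v^2/(2a)
    return reaction + braking

for mph in (30, 40, 50):
    print(f"{mph} mph -> {stopping_distance_m(mph):.1f} m")
# 40 mph -> ~49.7 m: right at the ~50 m dipped-beam figure,
# and far beyond the ~10 m the dashcam footage appears to show.
```

Under those assumptions, the 50 m dipped-beam requirement is only just enough at 40 mph, which supports the point that full beams (or functioning sensors) were needed here.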
 
  • Likes: HAYAO and russ_watters
  • #92
Ryan_m_b said:
the driver/car failing to put on high beams, possibly due to complacency with respect to the autopilot's capability.
Probably not.
High beams blind the driver in the car ahead of you, who then can't see as well.
Rear-view and side mirrors and all that.
So everyone in the city drives with low beams.
Imagine high beams behind a police car; not wise.
 
  • Likes: nsaspook
  • #93
Spinnor said:
Would limiting the cars to daylight hours of operation help? Do they lose some of their sensing "abilities" at night?
Uber is in the taxi business.
Curtailing the business at night is unrealistic if these cars are going to be allowed on the road.
The cars will be driving at night, so they have to be tested at night.
 
  • #94
Car driving is the most dangerous thing in our world. Take care of yourselves, guys.
 
  • #95
256bits said:
Uber is in the taxi business.

They are also teaching robots to drive. If restricted to daytime driving, there is a good chance that an alert backup driver would have prevented this death. What was the woman backup driver looking down at?
 
  • #96
Perhaps the back-up driver should have been a robot!
No wandering attention.
 
  • Likes: 256bits
  • #97
Spinnor said:
They are also teaching robots to drive. If restricted to daytime driving, there is a good chance that an alert backup driver would have prevented this death. What was the woman backup driver looking down at?
Backup driver -
Whether a human driver would have avoided this incident, we will never know.
With nothing to do but be there, look out the window and daydream a lot, I would suspect. Attention span in that type of job...
Bored kids would say, "There's nothing to do!"
How long is a "testing run" between breaks to get out and stretch one's legs?
An inactive person with little stimulation becomes lethargic after the routine sets in.
I am actually surprised the person hadn't fallen asleep, especially since the sun was down.

I have not seen any testing of human backup drivers and their value; it is just taken as a given that one is needed.
Is that just a bureaucratic decision without any basis, built on false assumptions?
Certainly there are recorded anecdotal incidents where the driver took over control of the vehicle.
But in any "experiment", which is what this is, there is no control group for comparing what the driverless vehicle would do on its own, what a human driver would do, and what a human supervising a driverless vehicle would do. The testing is on the fly in the real world, which is what you are getting at.

Restricted to day time driving -
Right. In hindsight, the technology for this vehicle was not ready for night driving.
But they do have to test at night to see what happens, since the cars will be used at night.
And it could be that it is not even ready for daytime driving.
The program, the sensors, something did not live up to expectations.
For all the hype surrounding the "safer than human drivers" claim, this was a complete failure.
If the lady has a family, I hope they sue for $500 million, just to get this thing on the right track, instead of what seems to be a race where the first one past the starting gate wins.
The promotional material is speaking to the converted. In the future, when these cars are licensed for the road, I predict that a mix of 10%, 20%, etc. will produce different outcomes than just one among many.
Ever hear of road rage?
What's better than to shoot the tires out from under a self-driving car that doesn't know how to drive?
Stay tuned.
 
  • #98
There seems to be some dispute circulating online regarding Uber's dashcam footage, which might explain why the headlights look defective or illegally modified. The road in the dashcam footage is pitch black, with the cyclist appearing at the last minute. However, regular citizens have posted pictures and videos of that road at night showing that it is in fact very well illuminated. Either Uber's dashcam is of extremely poor quality, or the footage has been altered somehow to make it appear darker than it was and give the impression Elaine jumped out of nowhere.

This video was taken by a member of the public at roughly the same time of night; the spot of the accident is passed at about the 30-second mark.

 
  • Likes: berkeman
  • #99
I don't think the crash video was defective or modified, just poor. If we remove the automation, could a reasonable person avoid that accident under these conditions (a dark object in the dark part of the road)?


#1 Pedestrian failed to cross safely or properly. In most parts of the world that's case closed.

Do we really need red flags and two safety drivers for driverless cars to handle bad 'corner cases' ("Anything that can go wrong will go wrong")?
 

  • #100
nsaspook said:
I don't think the crash video was defective or modified, just poor. If we remove the automation, could a reasonable person avoid that accident under these conditions (a dark object in the dark part of the road)?

Going by that video, I would say yes, absolutely. The area of the crash is visible for approximately ten seconds in that video. Aside from the fact that this shows the safety driver should have seen Elaine and intervened, the Uber had radar and LIDAR, neither of which should have been affected by the lighting of the road. Yet despite a straight road for a significant stretch between the car and Elaine, it did not appear to make any attempt to slow down.

nsaspook said:
#1 Pedestrian failed to cross safely or properly. In most parts of the world that's case closed.

Err, I'm not so sure on that. IIRC there are only two nations in the world with jaywalking laws; in the others, drivers are expected to give way to pedestrians in a safe manner. In the UK, for example, pedestrians are legally considered to have right of way on the roads, excepting those that specifically ban pedestrian use (like motorways). In the event of an accident, partial or full blame is only placed on the pedestrian if seeing and avoiding them can be shown to be impossible in the time given, assuming the driver is otherwise driving safely (i.e. not speeding, lights at the appropriate setting, not on the phone, etc.).
 
  • Likes: HAYAO and russ_watters
  • #101
Ryan_m_b said:
There seems to be some dispute circulating online regarding Uber's dashcam footage, which might explain why the headlights look defective or illegally modified. The road in the dashcam footage is pitch black, with the cyclist appearing at the last minute. However, regular citizens have posted pictures and videos of that road at night showing that it is in fact very well illuminated. Either Uber's dashcam is of extremely poor quality, or the footage has been altered somehow to make it appear darker than it was and give the impression Elaine jumped out of nowhere.

This video was taken by a member of the public at roughly the same time of night; the spot of the accident is passed at about the 30-second mark.


I was thinking something similar (I mentioned it earlier), but let's not get conspiracy-theory-ish about this: cameras are inherently inferior to our eyes for this purpose (range of brightness), and proper automatic adjustment is difficult at best. That's why HDR photography was invented. So the quality might be poor, but that is not unusual. The video you linked looks to me like it is using a brightness-boosting/leveling technology.

So my question is: are they using cameras for obstacle avoidance? Previously you mentioned LIDAR: was that speculation, or do you know it was using LIDAR?
 
  • #102
russ_watters said:
I was thinking something similar (I mentioned it earlier), but let's not get conspiracy-theory-ish about this: cameras are inherently inferior to our eyes for this purpose (range of brightness), and proper automatic adjustment is difficult at best. That's why HDR photography was invented. So the quality might be poor, but that is not unusual.

I agree about not getting conspiratorial but felt it worth mentioning as the footage has been the basis for a lot of the "she came out of the dark" comments.

russ_watters said:
So my question is: are they using cameras for obstacle avoidance? Previously you mentioned LIDAR: was that speculation, or do you know it was using LIDAR?

Not only did the vehicle have LIDAR and radar, but the maker of the LIDAR system has come out and said they can't understand how she wouldn't have been detected by it.

https://www.forbes.com/sites/alanoh...ers-failure-to-avoid-pedestrian/#2da6d8cb5cc2

Also of interest is this footage of the investigators looking into the accident. Using the same SUV involved in the crash, they can be seen driving that same stretch of road at speed and attempting to brake before reaching the point where Elaine was standing. From the video it seems there was more than enough space to brake in time.

https://twitter.com/LaurenReimerTV/status/977077647543955458

Tentatively, my thoughts that this is a failure of two parts (one, the car not responding; two, the driver not overriding the car) are looking to be correct.
 
  • #103
256bits said:
Backup driver -
Whether a human driver would have avoided this incident, we will never know.
With nothing to do but be there, look out the window and daydream a lot, I would suspect. Attention span in that type of job...
Bored kids would say, "There's nothing to do!"
How long is a "testing run" between breaks to get out and stretch one's legs?
An inactive person with little stimulation becomes lethargic after the routine sets in.
I am actually surprised the person hadn't fallen asleep, especially since the sun was down.

I have not seen any testing of human backup drivers and their value; it is just taken as a given that one is needed.
Spinnor said:
What was the [] backup driver looking down at?
The news is calling him a "safety driver". Presumably that means his primary function is preventing just this sort of accident. But until we know what he was doing or what his full job description was, it is difficult to know how much blame he has. If he was on Facebook, then he has considerable fault. If he was performing Uber-assigned systems monitoring, then he has none.
 
  • Likes: Ryan_m_b
  • #104
She's crossing from left to right; does the programming expect her to enter the roadway from the right?
 
  • #105
That assumption cannot be made, because people cross roads at corners from the left all the time.
 
  • Likes: russ_watters
  • #106
Ryan_m_b said:
Also of interest is this footage of the investigators looking into the accident. Using the same SUV involved in the crash, they can be seen driving that same stretch of road at speed and attempting to brake before reaching the point where Elaine was standing. From the video it seems there was more than enough space to brake in time.

https://twitter.com/LaurenReimerTV/status/977077647543955458

Tentatively, my thoughts that this is a failure of two parts (one, the car not responding; two, the driver not overriding the car) are looking to be correct.

I would say the video above only proves human response time in a known stopping condition. It has questionable usefulness for what actually happened, even if a human was driving. I'm not exactly sure what the technical basis for over-driving headlights would be here: 40 MPH on good low beams still gives plenty of time to brake with a perfect human response. I also don't know the legal limitations on over-driving headlights, because I don't see the speed limit enforced down to 40 MPH on roads with a 70+ MPH limit at night, even with marginal lighting.
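To put numbers on "over-driving headlights", the stopping-distance relation can be inverted to give the highest speed at which a driver can still stop within the illuminated distance. This is a sketch under the same generic assumptions as the one in #91 (1.5 s reaction, 7 m/s² deceleration); nothing here is measured from the actual vehicle or road:

```python
import math

# Solve v*t + v^2/(2a) = d for v (positive root of the quadratic):
# the fastest speed whose full stopping distance fits within sight distance d.
MS_TO_MPH = 2.23694

def max_speed_mph(sight_m, reaction_s=1.5, decel_ms2=7.0):
    a, t = decel_ms2, reaction_s
    v = -a * t + math.sqrt((a * t) ** 2 + 2 * a * sight_m)
    return v * MS_TO_MPH

for d in (10, 30, 50, 100):
    print(f"{d:>3} m of visibility -> {max_speed_mph(d):.0f} mph max")
# ~50 m of low-beam visibility supports roughly 40 mph, consistent with the
# claim above; the ~10 m the dashcam suggests would support only ~12 mph.
```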

IMO the poor homeless lady specifically chose that crossing location for a 'stealth' crossing: it's just beyond a street-light boundary, when a lit spot could have been chosen just as easily. She wore dark, non-reflective clothing with no bike reflectors, reducing her detection cross-section for humans, video, or detection systems; intentionally, so she could hide easily as a homeless person at risk. I'm not blaming her; I'm just saying her actions caused her death.
 
  • Likes: berkeman
  • #107
russ_watters said:
I was thinking something similar (I mentioned it earlier), but let's not get conspiracy-theory-ish about this: cameras are inherently inferior to our eyes for this purpose (range of brightness), and proper automatic adjustment is difficult at best. That's why HDR photography was invented. So the quality might be poor, but that is not unusual. The video you linked looks to me like it is using a brightness-boosting/leveling technology.

So my question is: are they using cameras for obstacle avoidance? Previously you mentioned LIDAR: was that speculation, or do you know it was using LIDAR?

From the little information I've been able to find, the safety driver was looking down at the object-detection display (which combined all sensors, including LIDAR) on a laptop or similar computer, as it was her job to monitor the system.

The timing of her bike crossing as a detection target, in relation to the background and the car's angle of approach, might have reduced her unique human signature as she blended with a bike that had various sizes of plastic bags strung over it. I wonder how many pictures of homeless people walking laden bikes are in the image databases used for high-confidence target classification? It's possible the object-detection system generated a false negative for a person or bike, classifying it as a more benign object, like a slowly moving trash bag near the side of the road, until it was too late to stop or avoid her.
https://pdfs.semanticscholar.org/cd36/512cbb2701dccda3c79c04e6839d9f95852b.pdf
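To make that false-negative mechanism concrete, here is a purely hypothetical sketch; the class names, scores, and threshold are invented for illustration and say nothing about Uber's actual (non-public) perception stack. The point is that an unusual object can split its score across classes so that no hazard class clears the confidence bar:

```python
# Hypothetical confidence-thresholded classifier. All names and numbers are
# invented; this is not Uber's software, just the failure mode described above.
BRAKE_CLASSES = {"pedestrian", "cyclist", "vehicle"}
CONFIDENCE_THRESHOLD = 0.8

def classify(scores):
    """Return the top class, or 'unknown' if nothing clears the threshold."""
    label, score = max(scores.items(), key=lambda kv: kv[1])
    return label if score >= CONFIDENCE_THRESHOLD else "unknown"

def should_brake(scores):
    # A stack that only brakes for confidently classified hazards.
    return classify(scores) in BRAKE_CLASSES

# A person pushing a bag-laden bike may split the score mass across classes,
# so no single class clears the threshold -> a false negative for 'pedestrian'.
ambiguous = {"pedestrian": 0.45, "cyclist": 0.30, "debris": 0.25}
print(classify(ambiguous))      # 'unknown'
print(should_brake(ambiguous))  # False -- the dangerous outcome described above
```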

 

  • #108
The job of the object-detection system, when an object appears in front of the car, is to tell the car to stop; after all, it could be a boulder or a moose.
 
  • #109
gleem said:
The job of the object-detection system, when an object appears in front of the car, is to tell the car to stop; after all, it could be a boulder or a moose.

What if it's a wind-driven trash bag or pages from a newspaper 'flying' across the road, and you have a cement truck behind you at 40 MPH? Classification of objects as a boulder or a moose matters beyond mere detection, because the response should be different for 'benign' objects seen by the detection system. Executing an emergency stop for every object detected is dangerous too.
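Sketched as code, the argument is that the planned response should depend on the object's class and on the traffic behind you, not merely on the fact that something was detected. Everything here (class lists, rules) is invented for illustration, not taken from any real driving stack:

```python
# Hypothetical response policy keyed on object class. Illustrative only.
BENIGN = {"trash_bag", "newspaper", "leaves"}
HARD = {"boulder", "moose", "pedestrian", "cyclist", "vehicle", "unknown"}

def plan_response(obj_class, tailgater_close):
    if obj_class in BENIGN:
        return "maintain"          # braking hard for a trash bag adds risk
    if obj_class in HARD:
        # 'unknown' is treated as hard: the safe default for a classifier miss.
        return "controlled_brake" if tailgater_close else "emergency_brake"
    return "controlled_brake"      # unrecognized class: err toward slowing

print(plan_response("newspaper", tailgater_close=True))    # maintain
print(plan_response("unknown", tailgater_close=True))      # controlled_brake
print(plan_response("pedestrian", tailgater_close=False))  # emergency_brake
```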
 
  • #110
The cement truck is driving too close for conditions. BTW, we didn't stop our car for what appeared to be a piece of rubber tire, and it turned out to be something more substantial, cracking the differential housing. If the object appears too quickly, even a human may initiate an emergency stop, or maybe even worse, swerve to try to avoid it, as we are often instructed not to do in the case of an animal.
 
  • #111
gleem said:
The cement truck is driving too close for conditions. BTW, we didn't stop our car for what appeared to be a piece of rubber tire, and it turned out to be something more substantial, cracking the differential housing. If the object appears too quickly, even a human may initiate an emergency stop, or maybe even worse, swerve to try to avoid it, as we are often instructed not to do in the case of an animal.

My point exactly on why classification is important: if the cement truck is driving too close for conditions, we need to be sure an emergency braking sequence is actually needed before taking that risk. The failure here seems to me to be a classification error, because pure detection should have been easy with a functional LIDAR system.

These systems are less robust than most people think.
 
  • Likes: collinsmark
  • #112
russ_watters said:
The news is calling him a "safety driver". Presumably that means his primary function is preventing just this sort of accident. But until we know what he was doing or what his full job description was, it is difficult to know how much blame he has. If he was on Facebook, then he has considerable fault. If he was performing Uber-assigned systems monitoring, then he has none.
The methodology used here for testing driverless cars is not foolproof.
In fact, consider the following:
1. Premise: the technology is considered mature enough to allow the vehicle to operate in real-world situations.
2. If the technology is mature enough, then the testing phase is unnecessary.
3. A 'safety driver' occupies the vehicle during the unnecessary testing phase.
4. The unnecessary testing phase then becomes a test of the actions and responses of the 'safety driver'.
5. Since the 'safety driver' is a human, the testing becomes an actual study in human behavior.
 
  • #113
In the aircraft world there is a lot of concern about the lack of manual flying due to excessive use of automation. The worry is that pilots are losing the skills needed to fly. There have been a few accidents due to pilots failing to respond correctly when the autopilot suddenly hands back control, and also cases where the autopilot does the wrong thing as a result of information from faulty sensors.

In one case a radio altimeter failed and indicated -8 feet constantly. The pilots recognised it was faulty, and the autopilot appeared to ignore the faulty data and work normally. Then, as they came in to land, the autopilot suddenly decided that -8 feet meant the aircraft must be at the right height to shut off the engines and slow for landing. The aircraft crashed 1 km short of the runway.

I own a 10 year old car and bits fail all the time without warning.
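A common line of defense against that kind of failure, sketched here in hypothetical form (this is a toy, not actual avionics or automotive logic), is to reject sensor readings that are physically implausible or frozen before any controller is allowed to act on them:

```python
# Toy sensor sanity check: reject out-of-range or frozen readings.
class SensorValidator:
    def __init__(self, lo, hi, max_repeats=50):
        self.lo, self.hi = lo, hi       # physically plausible range
        self.max_repeats = max_repeats  # frozen-value detector
        self.last = None
        self.repeats = 0

    def is_valid(self, reading):
        if not (self.lo <= reading <= self.hi):
            return False                # e.g. a negative radio altitude
        self.repeats = self.repeats + 1 if reading == self.last else 0
        self.last = reading
        return self.repeats < self.max_repeats

# An altimeter stuck at -8 ft fails the range check outright, so the
# controller would be forced to a fallback instead of acting on it.
alt = SensorValidator(lo=0.0, hi=8000.0)
print(alt.is_valid(-8.0))  # False
```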
 
  • #114
NYT article of interest: https://www.nytimes.com/2018/03/23/technology/uber-self-driving-cars-arizona.html

Also, no one seems to be picking up on the fact that the "safety driver", "operator" or whatever, was a convicted felon who spent time in jail for armed robbery...

Why would Uber hire someone like this to be part of a research project unless they were trying to cut corners financially, and hire people at dirt-cheap wages? This makes me very suspicious of where else they were trying to cut corners...
 
  • #115
dipole said:
NYT article of interest: https://www.nytimes.com/2018/03/23/technology/uber-self-driving-cars-arizona.html

Also, no one seems to be picking up on the fact that the "safety driver", "operator" or whatever, was a convicted felon who spent time in jail for armed robbery...

Why would Uber hire someone like this to be part of a research project unless they were trying to cut corners financially, and hire people at dirt-cheap wages? This makes me very suspicious of where else they were trying to cut corners...

https://www.jailstojobs.org/6017-2/
https://www.uber.com/info/policy/criminal-justice-reform/
 
  • Likes: OmCheeto
  • #116
I'm sure there must be some kind of 'black box' in these autodrive cars, just as there are on aircraft.
That means there is actual data which can reveal what the car's system was doing at the time.
If there is not such a black box, then why the hell isn't there?
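In software terms, a minimal 'black box' is just a ring buffer of recent state snapshots; this toy sketch assumes nothing about how real automotive event data recorders are actually built:

```python
from collections import deque
import time

# Toy event recorder: keep the last N state snapshots so the seconds
# before an incident can be reconstructed. Purely illustrative.
class BlackBox:
    def __init__(self, capacity=1000):
        self.buffer = deque(maxlen=capacity)  # oldest entries drop off

    def record(self, speed_mps, steering_deg, detections):
        self.buffer.append({
            "t": time.time(),
            "speed_mps": speed_mps,
            "steering_deg": steering_deg,
            "detections": detections,
        })

    def dump(self):
        return list(self.buffer)  # what investigators would pull after a crash

box = BlackBox()
box.record(17.9, -0.5, [{"class": "unknown", "range_m": 40.0}])
print(box.dump()[-1])
```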
 
  • #117
Isaac Asimov, where are you in our time of need?

https://en.m.wikipedia.org/wiki/Three_Laws_of_Robotics

Following is a quote from the Wikipedia article. Note particularly #5 - sounds like a good idea.

"In October 2013, Alan Winfield suggested at an EUCog meeting[55] a revised 5 laws that had been published, with commentary, by the EPSRC/AHRC working group in 2010.:[56]

  1. Robots are multi-use tools. Robots should not be designed solely or primarily to kill or harm humans, except in the interests of national security.
  2. Humans, not Robots, are responsible agents. Robots should be designed and operated as far as practicable to comply with existing laws, fundamental rights and freedoms, including privacy.
  3. Robots are products. They should be designed using processes which assure their safety and security.
  4. Robots are manufactured artefacts. They should not be designed in a deceptive way to exploit vulnerable users; instead their machine nature should be transparent.
  5. The person with legal responsibility for a robot should be attributed."
 
  • #118
256bits said:
The methodology used here for testing driverless cars is not foolproof.
In fact, consider the following:
1. Premise: the technology is considered mature enough to allow the vehicle to operate in real-world situations.
2. If the technology is mature enough, then the testing phase is unnecessary.
3. A 'safety driver' occupies the vehicle during the unnecessary testing phase.
4. The unnecessary testing phase then becomes a test of the actions and responses of the 'safety driver'.
5. Since the 'safety driver' is a human, the testing becomes an actual study in human behavior.
I don't understand, starting at #2: the "testing phase" happens before the technology is mature. It's what makes the technology mature! The idea that they are testing immature technology on real city streets with no government oversight is both bizarre and scary to me. It sounds like you are suggesting a catch-22 based on a premise that they have to be tested on real city streets before they are ready to be driven on city streets. That just isn't the case and shouldn't be acceptable (gotta break a few eggs? Only if they are fake eggs). See:
The ride-hailing giant published a new video earlier this month showing a glimpse of its fake city where the company's robocars learn how to drive in the real world.

Called Almono, the fake city is built on an old steel mill site along the Monongahela River in the Hazelwood neighborhood of Pittsburgh. It has a giant roundabout, fake cars, and roaming mannequins that jump out into the street without warning. [emphasis added]
http://www.businessinsider.com/ubers-fake-city-pittsburgh-self-driving-cars-2017-10

There is no excuse for Uber's car to not be able to handle such a straightforward/common accident scenario. To me, this is a homicide case. And to me, Tesla's fatal accident wasn't far behind this.
 
  • #119
russ_watters said:
I was thinking something similar (I mentioned it earlier), but let's not get conspiracy-theory-ish about this: cameras are inherently inferior to our eyes for this purpose (range of brightness), and proper automatic adjustment is difficult at best. That's why HDR photography was invented. So the quality might be poor, but that is not unusual. The video you linked looks to me like it is using a brightness-boosting/leveling technology.
I don't believe it's actually true that modern cameras are inferior to the human eye for dynamic range. Automatic adjustment when you have oncoming headlights mixed with darkness is a difficulty, however. Also, some kind of thresholding is probably necessary, since lots of shadow noise probably won't help your detection/tracking algorithms.

So my question is: are they using cameras for obstacle avoidance? Previously you mentioned LIDAR: was that speculation, or do you know it was using LIDAR?
More broadly, it's hard to speculate usefully without a very good idea of what the sensors were receiving at the time.
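On the thresholding point, here is a toy numpy demonstration with made-up numbers (nothing to do with Uber's actual cameras): digitally boosting a dark frame multiplies signal and noise alike, so the shadows get brighter without becoming any more informative to a detector:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: a dim shadow signal buried in sensor read noise (arbitrary units).
true_level = 5.0
dark_frame = true_level + rng.normal(0.0, 2.0, size=10_000)

gain = 8.0
boosted = dark_frame * gain  # software 'brightness boosting'

def snr(frame, level):
    return level / frame.std()

print(f"SNR before boost: {snr(dark_frame, true_level):.2f}")
print(f"SNR after  boost: {snr(boosted, true_level * gain):.2f}")
# Identical SNR: the gain scales signal and noise equally, which is why
# thresholding away shadow noise may be preferable for detection/tracking.
```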
 
  • Likes: nsaspook
  • #120
sandy stone said:
Isaac Asimov, where are you in our time of need?
To be honest, I don't see any of this as a "need" - with the exception of #1, which is a political question, these are already legal realities. Robots don't change anything that would cause these to need to be said. And they won't until/unless robots become legally recognized sentient AI.
 
