First self-driving car fatality

  • Thread starter Dr. Courtney
In summary, a self-driving Uber car struck and killed a pedestrian in Arizona. Because the experimental installed base of self-driving cars is still small, the crash raises broad concerns about the technology, and the tragedy will be scrutinized like no other autonomous-vehicle interaction before it.
  • #281
nitsuj said:
I find it silly to debate who was the driver in this situation.
The courts don't find it silly exactly because the laws can be ambiguous - they were not written with such a case in mind.
 
  • #282
nitsuj said:
Wouldn't the law certainly say the person in the driver's seat was the driver?

I posted exactly what the law said in message #274.
 
  • #283
Vanadium 50 said:
I posted exactly what the law said in message #274.

I don't see that as unclear in the context of this case. The lone person in the car, in the driver's seat, is the operator of the vehicle, regardless of what tech was engaged by the operator to assist in operating the vehicle.
mfb said:
The courts don't find it silly exactly because the laws can be ambiguous - they were not written with such a case in mind.

Laws weren't written for an infinite number of variables. IMO the driver was in control of the car. I appreciate that you see it differently.
 
  • #284
Vanadium 50 said:
I looked it up. The legal term is "operator" and the legal definition is: "Operator" means a person who drives a motor vehicle on a highway, who is in actual physical control of a motor vehicle on a highway or who is exercising control over or steering a vehicle being towed by a motor vehicle.

I think the phrase "actual physical control" is subject to lawyering...
"Actual physical control" may be subject to lawyering, but is "person?" Unless you want to argue that the car's programming is a person, either the check driver was in control or no one was.
 
  • #285
TeethWhitener said:
"Actual physical control" may be subject to lawyering, but is "person?" Unless you want to argue that the car's programming is a person, either the check driver was in control or no one was.
Not the programming, the programmer.

It's a tough call, though, whether to consider a disabled safety feature an "operator" error or a product defect.
 
  • #286
russ_watters said:
Not the programming, the programmer.

What are the limits of "Actual physical control" by a person?

As was discussed earlier in this thread, would the programmer(s) be negligent if the programming produced driving equivalent to that of a reasonably prudent person (the human standard under current law)? And if capabilities beyond human driving, like LIDAR, night vision, or even emergency auto-braking, fail to prevent a fatal 'accident', what would the liability be when a reasonably prudent human driver would also have failed without those advanced capabilities?
 
  • #287
mfb said:
Self-driving car hits self-driving robot
The robot wars have begun.

A robot got lost following other robots and ended up on a street where a Tesla pushed it to the side. The driver said he was aware of the robot but didn't brake - probably because no human was in danger. The robot fell over and got damaged.

While it is interesting to learn how the robot made it to the street, the car should have avoided the robot, and it will be interesting to see Tesla's reaction to it.
Apparently this was a staged crash that the media fell for.
 
  • #288
One can actually see a rope on the robot's arm.
 
  • #289
256bits said:
One can actually see a rope on the robot's arm.
[Attached image: Boris_natasha_fearless.jpg]
  • #290
nsaspook said:
What are the limits of "Actual physical control" by a person?

As was discussed earlier in this thread, would the programmer(s) be negligent if the programming produced driving equivalent to that of a reasonably prudent person (the human standard under current law)? And if capabilities beyond human driving, like LIDAR, night vision, or even emergency auto-braking, fail to prevent a fatal 'accident', what would the liability be when a reasonably prudent human driver would also have failed without those advanced capabilities?
Yes, this is a key open question. The collision-avoidance and other automation features that are becoming widespread carry disclaimers in the owner's manuals telling drivers that they remain responsible and should not rely on those features. That does make logical sense, because those features should only kick in after the human has failed to act when they should have. But that's just what their lawyers tell them to write; I don't know if the disclaimers have been tested in litigation.

But in this accident the level of automation was higher (and thus the level of ongoing control by the driver lower), and a safety feature was purposely disabled by the manufacturer, which is a specific decision by a person.

One thing that is cool, but a double-edged sword, about Teslas is that their connectedness allows them to be updated without the user's knowledge. This creates a risk that a software flaw could be installed overnight in a million cars, suddenly making them all unsafe or non-functional.
 
  • #291
CWatters said:
Apparently this was a staged crash that the media fell for.
Electrek wrote an article about it: A robot company stages Tesla crash as a PR stunt, media buys it

While the rest of the article is good, I'm not sure how good their argument about the passenger's statement is. Yes, the feature has a different name, but if people could correctly describe which software feature they used, we could get rid of half of the IT support staff.
 
  • #293
https://www.forbes.com/sites/bradte...cy-for-fatal-uber-robocar-crash/#316b8624c6d2
NTSB Hearing Blames Humans, Software And Policy For Fatal Uber Robocar Crash - But Mostly Humans

Human errors

When it comes to human fault, the report noted that Herzberg had a “high concentration of methamphetamine” (more than 10 times the medicinal dose) in her blood which would alter her perception. She also had some marijuana residue. She did not look to her right at the oncoming vehicle until 1 second before the crash.

There was also confirmation that the safety driver had indeed pulled out a cell phone and was streaming a TV show on it, looking down at it 34% of the time during her driving session, with a full 5-second "glance" from 6 to 1 seconds prior to the impact.
...
Normally, a pedestrian crossing a high speed street outside a crosswalk would exercise some minimal caution, starting with "look both ways before crossing the street" as we are all taught as children. By all appearances, the crash took place late on a Sunday night on a largely empty road, exactly the sort of situation where a person would normally hear any approaching car well in advance, and check regularly to the right for oncoming traffic, which would be very obvious because of its headlights, obvious even in peripheral vision. Herzberg crossed obliviously, looking over just one second before impact. NTSB investigators attributed this to the meth in her system. They did not know if the concentration in her blood was going up (due to recently taken doses) and altering perception, or coming down (causing unusual moods).
 
  • #294
As an update, the driver has been charged with negligent homicide.

If convicted, this will be the driver's second felony conviction.
 

1. What caused the first self-driving car fatality?

The first self-driving car fatality occurred in 2016 when a Tesla Model S collided with a tractor-trailer. The cause of the accident was determined to be a combination of factors, including the failure of the car's sensors to detect the white side of the truck against a bright sky, and the driver's lack of attention and failure to take control of the vehicle.

2. How common are self-driving car fatalities?

Self-driving car fatalities are extremely rare. As of 2021, there have been a total of 5 reported fatalities involving self-driving cars, with millions of miles driven by these vehicles. In comparison, there were over 38,000 motor vehicle fatalities in the United States in 2019 alone.

3. Are self-driving cars safe?

Overall, self-driving cars have been shown to be safer than human-driven cars. They eliminate the potential for human error, which is a leading cause of car accidents. However, as with any new technology, there are still risks and challenges that need to be addressed.

4. Who is responsible in the event of a self-driving car fatality?

Determining responsibility in the event of a self-driving car fatality can be complex and may involve multiple parties. In most cases, the manufacturer of the self-driving technology and the owner of the vehicle may share responsibility. Additionally, the driver may also be held responsible if they were not following proper safety protocols.

5. What measures are being taken to prevent future self-driving car fatalities?

Following the first self-driving car fatality, companies and regulators have taken steps to improve the safety of self-driving cars. This includes stricter regulations, improved technology and safety features, and increased testing and monitoring. Additionally, education and awareness about self-driving cars and their capabilities are being emphasized to prevent human error and improve overall safety.
