First fatal accident involving a car in self-driving mode

In summary: a Tesla driving in Autopilot mode was involved in a fatal crash. The car was reportedly programmed to ignore objects that have components only "in the air" without a ground connection (e.g. overhead road signs), which may be why it failed to register the truck trailer crossing the highway.
  • #71
mheslep said:
That sounds completely unreasonable, but perhaps I misunderstand. You suggest the driver monitor every pending significant action of the would-be autonomous vehicle, and in the, say, 2 or 3 seconds before a problem, if the vehicle fails to react in the first second or so the human driver should always stand ready to pounce?
At least for now, the driver should watch the street and be ready to take over if necessary. Yes. That's what Tesla requires the drivers to do, and for now it is a reasonable requirement.
The situation is not that different from driving yourself on a highway, for example: typically you adjust your speed and course slightly once in a while to stay in your lane, to keep your distance from the car ahead, or to stay at your preferred speed. If an emergency situation occurs, you suddenly change that and brake and/or steer. With Tesla, the main difference is that the slight speed and course adjustments are done by the car. It can also avoid emergency situations by braking before things get dangerous. If an emergency does come up, brake, but chances are good the car starts braking before you can even react: an additional gain in safety.
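To make that division of labor concrete, here is a minimal sketch of that kind of control loop (illustration only, not Tesla's actual algorithm; all names, gains, and thresholds are invented):

```python
# Illustration only -- not Tesla's algorithm. One tick of a naive
# lane-keeping / car-following loop with emergency braking on top.
def control_step(lane_offset_m, gap_m, desired_gap_m,
                 speed_mps, set_speed_mps):
    EMERGENCY_GAP_M = 10.0   # brake hard below this gap (invented)
    STEER_GAIN = 0.05        # rad of steering per meter of lane offset
    SPEED_GAIN = 0.1         # (m/s^2) per (m/s) of speed error

    if gap_m < EMERGENCY_GAP_M:
        # Emergency braking, typically triggered before a human reacts.
        return {"steer_rad": 0.0, "accel_mps2": -8.0}

    # The small, continuous corrections the car takes over from you.
    steer = -STEER_GAIN * lane_offset_m
    target = min(set_speed_mps, speed_mps + (gap_m - desired_gap_m))
    accel = SPEED_GAIN * (target - speed_mps)
    return {"steer_rad": steer, "accel_mps2": max(-3.0, min(1.5, accel))}

# e.g. slightly left of center, comfortable gap, near set speed:
print(control_step(0.3, 45.0, 40.0, 29.0, 30.0))
```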
mheslep said:
The case of this truck turning across the highway *was* routine.
Depends on how much warning time the car/driver had, something I don't know. Do you? If there was enough time, any driver watching the street could have started braking before it got dangerous. I mean, what do you expect drivers to do if they watch the street and the car does something wrong, wait happily until the car hits the truck at full speed?
 
  • #72
The WSJ article out today on the Tesla Autopilot is superb, as the WSJ frequently is on industry topics. The article nails the game Tesla is playing with the Autopilot: putting out official due-diligence disclaimer statements on one hand, and promoting its capabilities on the other. There have been several severe accidents in Autopilot vehicles, and some near misses avoided by drivers, though only the one fatality.

The wonder-vehicle line:
In March 2015, Mr. Musk told reporters the system could make it possible in the future for Tesla vehicles to drive themselves “from parking lot to parking lot.”...

In April of this year, Mr. Musk retweeted a video by Mr. Brown that shows the Autopilot of his Model S preventing a crash...

Tesla has said its technology is the most advanced system on the market,...

The due-diligence line:
Tesla said the self-driving system worked exactly as it should have, citing data from the car’s “historical log file,” a document signed by Mr. Bennett, and an owner’s manual declaring the technology “cannot detect all objects and may not brake/decelerate for stationary vehicles.”...

Owner’s manuals state that the technology “is designed for your driving comfort and convenience and is not a collision warning or avoidance system.”...

customers are told explicitly what Autopilot is and isn’t capable of doing.
Oh, if it was in the owner's manual, that's OK then. Ford should have put a notice in their Pinto owner's manual years ago that rear impacts could ignite the gas tank, and thereby avoided the 1.5-million-vehicle recall.
 
  • #73
mheslep said:
Oh, if it was in the owner's manual, that's OK then. Ford should have put a notice in their Pinto owner's manual years ago that rear impacts could ignite the gas tank, and thereby avoided the 1.5-million-vehicle recall.
You can use a Tesla without the autopilot. You cannot use a Pinto without a gas tank.
 
  • #75
Another 'autopilot' incident. This one seems to demonstrate how completely some people believe the autopilot hype.

http://abcnews.go.com/Business/wireStory/feds-seek-autopilot-data-tesla-crash-probe-40515954
The company said the Model X alerted the driver to put his hands on the wheel, but he didn't do it. "As road conditions became increasingly uncertain, the vehicle again alerted the driver to put his hands on the wheel. He did not do so and shortly thereafter the vehicle collided with a post on the edge of the roadway," the statement said.

The car negotiated a right curve and went off the road, traveling about 200 feet on the narrow shoulder, taking out 13 posts, Shope said.

The trooper did not cite the driver, saying he believed any citation would be voided because of the driver's claim that the car was on Autopilot.

https://teslamotorsclub.com/tmc/threads/my-friend-model-x-crash-bad-on-ap-yesterday.73308/
 
  • #78
mheslep said:
Musk has an auto response ready for autopilot accidents: "Not material"
http://fortune.com/2016/07/05/elon-musk-tesla-autopilot-stock-sale/
I'm not sure of the legality/insider trading implications of that, but it will be interesting to see if that goes anywhere. Something that has bothered me since the start but was more strongly/specifically worded in that article is this:
He [Musk] continued, “Indeed, if anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”
The implicit claim (also implied in this thread) is that Tesla's fatalities per 100 million miles stat is comparable to the NHTSA's. Is it? That would surprise me a lot, given in particular Tesla's insistence that the autopilot should only be used in easy driving situations.

Here's the NHTSA's stats:
http://www-fars.nhtsa.dot.gov/Main/index.aspx

They don't differentiate between driving regimes (highway vs. local roads, for example), but they do include pedestrians and motorcyclists, which doesn't seem an appropriate comparison. And given the low number of Tesla miles driven, if a Tesla kills a motorcyclist or pedestrian in the next couple of years, its fatality rate for those categories will look vastly higher than average.
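To put a number on how little one data point shows: Tesla's public statement at the time compared one fatality in roughly 130 million Autopilot miles against a US average of one per 94 million miles. A quick exact Poisson interval (a sketch; the mileage figures are Tesla's, and the comparison caveats above still apply) makes the uncertainty explicit:

```python
# 95% exact Poisson interval for "1 fatality in 130M miles".
# Mileage figures are the ones Tesla cited publicly in 2016.
from scipy.stats import chi2

k, miles = 1, 130e6
lo = chi2.ppf(0.025, 2 * k) / 2          # ~0.025 expected events
hi = chi2.ppf(0.975, 2 * (k + 1)) / 2    # ~5.57 expected events

print(f"point estimate: {k / miles * 1e8:.2f} per 100M miles")
print(f"95% interval:   {lo / miles * 1e8:.2f} to {hi / miles * 1e8:.2f}")
print(f"US average:     {1e8 / 94e6:.2f} per 100M miles")
# The interval spans ~0.02 to ~4.3 per 100M miles: consistent with
# being several times safer OR several times more dangerous than average.
```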
 
  • #79
russ_watters said:
... Something that has bothered me since the start but was more strongly/specifically worded in that article is this:

The implicit claim (also implied in this thread) is that Tesla's fatalities per 100 million miles stat is comparable to the NHTSA's. Is it? That would surprise me a lot, given in particular Tesla's insistence that the autopilot should only be used in easy driving situations.

Here's the NHTSA's stats:
http://www-fars.nhtsa.dot.gov/Main/index.aspx
I find Musk's response to Fortune about the accident rate, as quoted, both highly arrogant and invalid. A few researchers in the autonomous vehicle field have already pointed out that comparing Tesla's data, miles driven in a heavy, strongly constructed sedan, against the full variety of the world's vehicles, including the likes of motorcycles, is invalid.

The arrogance comes in justifying the macro effect of autonomous vehicles on accident rates at large while AVs might well take more lives in sub-groups of drivers. Musk is not entitled to make that decision. That is, several factors drive the accident rate given by the NHTSA. One of the largest is still alcohol, a factor in about a third of all US accidents.
https://www.google.com/search?q=percentage+of+vehicle+accidents+involving+alcohol
Clearly AVs, despite their blindness to crossing trucks, could greatly lower alcohol-related accidents and thereby make AV stats look better.

But what about, say, drivers hauling their kids to school? Wild guess here, but among that group I'd guess alcohol-related accidents are next to zero, and their tendency to do the other things that raise the accident rate, like exceeding the posted limit by 20 mph, running stops, or letting their tires go bald, is also well below normal. Their accident rate is *not* the norm Musk says his AV can beat, saving the millions. If one of them in an AV drives under a truck and continues on as if nothing happened? Not material to Musk.
 
  • #80
NHTSA files formal request for crash information (about all crashes):
http://www.foxnews.com/leisure/2016/07/13/feds-examine-how-tesla-autopilot-reacts-to-crossing-traffic/?intcmp=hpffo&intcmp=obnetwork

It is due August 26 and carries fines if not responded to.
 
  • #81
mheslep said:
If one of them in an AV drives under a truck and continues on as if nothing happened?
Automated cars get into different kinds of accidents. The dataset is too small to make precise comparisons to human drivers, but human drivers cause accidents an automated car won't (because it is never drunk, for example), while automated vehicles cause accidents a human driver won't. Luckily you can combine the strengths of both in a Tesla: read the manual and pay attention to the road while you use the autopilot. That way you limit the accident rate to situations where both the autopilot and the human fail to react properly.
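In rough numbers (a sketch with made-up probabilities; whether the two failure modes are really independent is exactly the open question):

```python
# Made-up per-situation failure probabilities, for illustration only.
p_autopilot = 0.001   # autopilot mishandles a given situation
p_human = 0.01        # an attentive human mishandles the same one

print(p_autopilot * p_human)   # 1e-05 if the failures are independent
# If inattention makes the human fail exactly when the autopilot does,
# the combined rate collapses back toward p_autopilot.
```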
 
  • #82
mfb said:
Automated cars get into different kinds of accidents. The dataset is too small to make precise comparisons to human drivers,
Yes, this is why Musk needs to stop making condescending comments about doing the math on AVs.

but human drivers cause accidents an automated car won't (because it is never drunk, for example), while automated vehicles cause accidents a human driver won't. Luckily you can combine the strengths of both in a Tesla: read the manual and pay attention to the road while you use the autopilot. That way you limit the accident rate to situations where both the autopilot and the human fail to react properly.

One can't claim credit on the one hand for lower accident rates because an AV is never drunk, while on the other hand demanding that those same (possibly drunk) humans pay attention, so that any AV mistake becomes the fault of a human not paying attention.

More generally, humans are lousy at paying attention while not being actively involved. Researchers in AVs keep repeating this.
 
  • #83
mheslep said:
One can't claim credit on the one hand for lower accident rates because an AV is never drunk, while on the other hand demanding that those same (possibly drunk) humans pay attention, so that any AV mistake becomes the fault of a human not paying attention.
I would expect an AV with a drunk driver paying attention (sort of) to be better than a drunk driver driving. Both are illegal, of course.
mheslep said:
More generally, humans are lousy at paying attention while not being actively involved.
I wonder if you could involve them via some mandatory mini-games that involve watching traffic. Press a button every time a red car passes by, or whatever.
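Something like this toy sketch, maybe (every name and threshold here is invented):

```python
# Toy attention mini-game: prompt the driver at random moments and
# escalate if the response is slow. Purely hypothetical.
import random
import time

def attention_check(wait_for_button, max_reaction_s=2.0):
    """wait_for_button() should block until the driver presses the
    button (hypothetical hardware interface)."""
    prompt = random.choice(["red car", "speed sign", "exit ramp"])
    print(f"Saw a {prompt}? Press the button.")
    start = time.monotonic()
    wait_for_button()
    if time.monotonic() - start > max_reaction_s:
        return "escalate"   # chime, then hand control back safely
    return "ok"

# e.g. simulate a half-second reaction:
print(attention_check(lambda: time.sleep(0.5)))   # -> "ok"
```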
 
  • #84
mheslep said:
More generally, humans are lousy at paying attention while not being actively involved. Researchers in AVs keep repeating this.

The Tesla 'Autopilot' operates in the man-machine control loop at the point (the physical driving process) where the brain normally operates primarily below full awareness, in the land of instincts and impulses. We learn to trust our internal human 'autopilot' to evaluate the environment and warn the fully aware driving brain of danger in time to avoid problems.

The problem with the Tesla system, IMO, is trust. Tesla has managed to create a system (the entire human-machine interface) that seems so good it can be trusted to drive even when the manual says NO. OK, we have little game-like cues to maintain driving focus, but as long as the car handles those subconscious driving activities without intervention, we know it's just a game rather than real driving.

https://hbr.org/2016/07/tesla-autopilot-and-the-challenge-of-trusting-machines
That decision — to trust or not — isn’t necessarily a rational one, and it’s not based only on the instructions people are given or even the way the machine is designed; as long as there have been cars, there have been people who anthropomorphize them. But the addition of technology that starts to feel like intelligence only furthers that inclination. We trust machines when we see something like ourselves in them — and the more we do that, the more we’re likely to believe they’re as capable as we are.

http://arstechnica.com/cars/2016/05...las-autopilot-forced-me-to-trust-the-machine/
It takes a while to get used to this feeling. Instead of serving as the primary means of direction for a car, you're now a meat-based backup and failsafe system. Instincts and impulses formed by more than two decades behind the wheel scream out a warning—"GRAB THE WHEEL NOW OR YOU'LL DIE"—while the rational forebrain fights back. Eventually, the voices quiet as the car starts to prove itself. When the road curves, the car follows. If the car is ever going too fast to negotiate the curve, it slows down and then accelerates smoothly back out of the turn.
 
  • #86
nsaspook said:
The Tesla 'Autopilot' operates in the man-machine control loop at the point (the physical driving process) where the brain normally operates primarily below full awareness, in the land of instincts and impulses. We learn to trust our internal human 'autopilot' to evaluate the environment and warn the fully aware driving brain of danger in time to avoid problems.

The problem with the Tesla system, IMO, is trust. Tesla has managed to create a system (the entire human-machine interface) that seems so good it can be trusted to drive even when the manual says NO. OK, we have little game-like cues to maintain driving focus, but as long as the car handles those subconscious driving activities without intervention, we know it's just a game rather than real driving.
That's basically how I see it, though I would describe it as "responsibility" instead of trust (but it could be trust in who has responsibility...).

New back-up safety features like brake assist/overrides don't need that trust/responsibility, because the person is still supposed to be fully in charge: the computer assist happens after the human has already failed. Such systems can only improve safety, because the human never has to think about who has responsibility.

On the other hand, systems where the computer has primary responsibility and the person "back-up" responsibility are inherently flawed/self-contradictory, because the person doesn't have the reaction speed necessary to take over if the computer fails. So the person's responsibility has to be either all or nothing.

This problem can be mitigated somewhat, but it can't be fixed by requiring a person to keep their hands on the wheel or by notifying the driver to take back control, and that applies even to radar-assisted cruise control. It isn't the failures the computer knows about that are the biggest problem; it's the ones it doesn't know about.

Sooner or later, a Tesla will notify its driver to take back control just as its front wheels go over the cliff. Elon won't be successful in using the "hey, it warned you!" defense there.
 
  • #87
russ_watters said:
Consumer Reports calls on Tesla to disable autopilot:

http://www.usatoday.com/story/money...umer-reports-tesla-motors-autopilot/87075956/

Regaining Control
Research shows that humans are notoriously bad at re-engaging with complex tasks after their attention has been allowed to wander. According to a 2015 NHTSA study (PDF), it took test subjects anywhere from three to 17 seconds to regain control of a semi-autonomous vehicle when alerted that the car was no longer under the computer's control. At 65 mph, that's between 100 feet and quarter-mile traveled by a vehicle effectively under no one's control.

This is what’s known by researchers as the “Handoff Problem.” Google, which has been working on its Self-Driving Car Project since 2009, described the Handoff Problem in https://www.google.com/selfdrivingcar/files/reports/report-1015.pdf. "People trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax,” said the report. “There’s also the challenge of context—once you take back control, do you have enough understanding of what’s going on around the vehicle to make the right decision?"
 
  • #88
17 seconds is a long time, but even 3 seconds is too long, and not surprising. Situational awareness requires history. A simple action like stomping on the brake can fix a lot of problems, but if the car is, for example, having trouble negotiating a curve, you have to get a feel for the handling before you can figure out how much steering to apply.
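For scale, the distance covered during those reaction windows at the study's 65 mph (straightforward arithmetic):

```python
# Distance covered during the NHTSA study's 3-17 s handoff times.
speed_fps = 65 * 5280 / 3600   # 65 mph ~= 95.3 ft/s

for t in (3, 17):
    d = speed_fps * t
    print(f"{t:2d} s -> {d:4.0f} ft ({d / 5280:.2f} mi)")
# 3 s -> ~286 ft, 17 s -> ~1621 ft (0.31 mi), so even the quoted
# "100 feet" low end understates the distance.
```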
 
  • #89
Regaining Control
Research shows that humans are notoriously bad at re-engaging with complex tasks after their attention has been allowed to wander. According to a 2015 NHTSA study (PDF), it took test subjects anywhere from three to 17 seconds to regain control of a semi-autonomous vehicle when alerted that the car was no longer under the computer's control. ...

This is what’s known by researchers as the “Handoff Problem.” Google, which has been working on its Self-Driving Car Project since 2009, described the Handoff Problem in https://www.google.com/selfdrivingcar/files/reports/report-1015.pdf. "People trust technology very quickly once they see it works. As a result, it’s difficult for them to dip in and out of the task of driving when they are encouraged to switch off and relax,” said the report. “There’s also the challenge of context—once you take back control, do you have enough understanding of what’s going on around the vehicle to make the right decision?"

Thanks NSA. Of course. Of course. This is not a matter of dismissing the owner's manual; it is simply impossible to change the attention behaviour of human beings as a group. If it were possible, AVs would be moot.

Tesla's assertion that the driver must pay attention in autopilot mode, else accidents are the driver's fault, is ridiculous on its face: akin to encouraging drivers to cruise around with no safety harness on but to continually brace themselves against the ton-scale forces of an impact. If you go out the windshield, you just didn't brace hard enough. Now stop bothering us, we're inventing the future.
 
  • #91
US-27A is a four-lane highway with a posted speed limit of 65 mph.
[...]
Tesla system performance data downloaded from the car indicated that vehicle speed just prior to impact was 74 mph.
How did that happen?
 
  • #92
mfb said:
How did that happen?
The Tesla autopilot does not pick the speed; the driver does.
 
  • #93
Odd. What happens if the driver sets it to 65 mph and then the speed limit gets lowered at some point of the road?
 
  • #94
mfb said:
Odd. What happens if the driver sets it to 65 mph and then the speed limit gets lowered at some point of the road?
I don't think the car responds. I read through some descriptions a few weeks ago and the actual capabilities are well short of what the hype had implied to me.
 
  • #95
Then drivers have to look for speed limits, which means they have to pay attention? scnr
 
  • #96
mfb said:
Then drivers have to look for speed limits, which means they have to pay attention? scnr
They are supposed to, yes.
 
  • #97
I'm also curious about how the driver's desired speed coordinates with the autopilot settings, but I doubt speeding (+9 mph in a 65 mph zone) was relevant to this accident. Tesla says the driver never hit the brakes anyway, which would have made the accident unavoidable at any reasonable speed, and this particular accident was not one where the vehicle came to a stop at impact.
 
  • #98
mheslep said:
I'm also curious about how the driver's desired speed coordinates with the autopilot settings, but I doubt speeding (+9 mph in a 65 mph zone) was relevant to this accident. Tesla says the driver never hit the brakes anyway, which would have made the accident unavoidable at any reasonable speed, and this particular accident was not one where the vehicle came to a stop at impact.
I've read several adaptive cruise control specs and recall that there is generally an upper limit to the setpoint.

I think what mfb is driving at, though, is that this is another reason why the driver has to pay attention. And I agree, particularly for the type of road in this case: a four-lane divided local highway will often take you through towns where the speed limit drops to 35. If the driver doesn't pay attention, the car would speed right through.
 
  • #99
There's something fundamentally wrong with enabling idiocy at the wheel.

Too much automation. That's how you get pilots who cannot land an airliner when the autopilot gets confused.

A lane hold makes sense, as does a wing leveler, but I'd put a time limit on it and lock out simultaneous speed hold.

Try autopilot on this road at 74 mph.
[attached image of the road in question]
 
  • #100
russ_watters said:
I've read several adaptive cruise control specs and recall that there is generally an upper limit to the setpoint.

I think what mfb is driving at, though, is that this is another reason why the driver has to pay attention. And I agree, particularly for the type of road in this case: a four-lane divided local highway will often take you through towns where the speed limit drops to 35. If the driver doesn't pay attention, the car would speed right through.
Apparently so for this version of the Tesla, though I don't know the details. Autonomous vehicles theoretically can use GPS and road data to know where speed changes occur (which would always have some disagreement with reality), and then the ability also exists for the vision system to "read" posted speed signs.
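In sketch form (hypothetical; no claim that this is how Tesla's system works), combining the two sources and clamping the cruise setpoint might look like:

```python
# Hypothetical speed-limit clamp combining map data and sign reading.
# Neither source is authoritative: maps go stale, and an obscured
# sign (see the photo below) defeats the camera.
from typing import Optional

def effective_limit(map_mph: Optional[int],
                    sign_mph: Optional[int]) -> Optional[int]:
    """Most conservative of the available limit estimates."""
    seen = [v for v in (map_mph, sign_mph) if v is not None]
    return min(seen) if seen else None

def clamp_setpoint(driver_mph: int, limit_mph: Optional[int],
                   tolerance_mph: int = 5) -> int:
    if limit_mph is None:
        return driver_mph               # no data: trust the driver
    return min(driver_mph, limit_mph + tolerance_mph)

print(clamp_setpoint(74, effective_limit(65, None)))   # 70
print(clamp_setpoint(74, effective_limit(65, 35)))     # 40: town ahead
```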

Video from Tesla's autonomous software vendor (until yesterday):

[embedded video]
This too can have real-world limitations, of a kind that wouldn't catastrophically trip up people.

[image: speed limit sign obscured by a hedge]
 
  • #101
jim hardy said:
Too much automation.
I think misused, not too much. No need to start a Luddite attack on ATMs; they work fine at all hours and don't run over people.
 
  • #102
jim hardy said:
Try autopilot on this road at 74 mph.
The car slows down automatically if curves (or other traffic) make it necessary.

Road signs that are barely visible are a problem for humans as well; here, online databases can be even better than human drivers.
 
  • #103
Humans, though, excel at making identifications from insufficient or hidden information, relative to machines. They can often identify a barely visible road sign as a barely visible road sign, distinguish it from a plastic bag in the bushes, and act accordingly.
 
  • #104
On a positive note about the Tesla Autopilot: Man says Tesla Autopilot drove him to the hospital, saved his life
https://www.yahoo.com/finance/news/man-says-tesla-autopilot-drove-191549779.html

The system has come under fire after it was involved in a fatal Florida crash in May, but Neally told online magazine Slate that Autopilot drove him 20 miles down a freeway to a hospital, while Neally suffered a potentially fatal blood vessel blockage in his lung, known as a pulmonary embolism. The hospital was right off the freeway exit, and Neally was able to steer the car the last few meters and check himself into the emergency room, the report said.

Tesla's Autopilot technology has been cited in both the May crash and a second, non-fatal crash in Montana in June. Both the National Highway Traffic Safety Administration and the National Transportation Safety Board have investigated the Florida crash, and the Securities and Exchange Commission reportedly looked into whether Tesla broke securities law by failing to disclose information about the May accident before an equity offering.

A group of researchers from the University of South Carolina, China's Zhejiang University and the Chinese security firm Qihoo 360 apparently figured out how to hack into the Autopilot system and jam the radar to prevent it from seeing an object in front of it.
 
  • #105
Nissan ProPILOT self-driving chair:

[embedded video]
