News: First fatal accident involving a car in self-driving mode

Summary:
The first fatal accident involving a self-driving Tesla occurred when a tractor-trailer made a left turn in front of the vehicle, which failed to brake. Concerns were raised about the software's ability to detect obstacles, particularly those without a ground connection, leading to questions about its reliability in various driving scenarios. Despite the incident, statistics suggest that Tesla's autopilot may still have a lower accident rate compared to human drivers. The discussion highlighted the need for better oversight and regulation of autonomous vehicles, as well as the ethical implications of allowing such technology to operate without adequate human monitoring. Overall, the incident underscores the challenges and risks associated with the early deployment of self-driving technology.
  • #91
US-27A is a four-lane highway with a posted speed limit of 65 mph.
[...]
Tesla system performance data downloaded from the car indicated that vehicle speed just prior to impact was 74 mph.
How did that happen?
 
  • #92
mfb said:
How did that happen?
The Tesla autopilot does not pick the speed; the driver does.
 
  • #93
Odd. What happens if the driver sets it to 65 mph and then the speed limit gets lowered at some point along the road?
 
  • #94
mfb said:
Odd. What happens if the driver sets it to 65 mph and then the speed limit gets lowered at some point along the road?
I don't think the car responds. I read through some descriptions a few weeks ago and the actual capabilities are well short of what the hype had implied to me.
 
  • #95
Then drivers have to look for speed limits, which means they have to pay attention? scnr
 
  • #96
mfb said:
Then drivers have to look for speed limits, which means they have to pay attention? scnr
They are supposed to, yes.
 
  • #97
I'm also curious about how the driver's desired speed coordinates with the autopilot settings, but I doubt speeding (+9 mph in a 65 mph zone) was relevant to this accident. Tesla says the driver never hit the brakes anyway, making the accident unavoidable by the driver at any reasonable speed, and this particular accident was not one where the vehicle came to a stop at impact.
 
  • #98
mheslep said:
I'm also curious about how the driver's desired speed coordinates with the autopilot settings, but I doubt speeding (+9 mph in a 65 mph zone) was relevant to this accident. Tesla says the driver never hit the brakes anyway, making the accident unavoidable by the driver at any reasonable speed, and this particular accident was not one where the vehicle came to a stop at impact.
I've read several adaptive cruise control specs and recall that there is generally an upper limit to the setpoint.

I think what mfb is driving at, though, is that this is another reason why the driver has to pay attention. And I agree, particularly for the type of road in this case: a 4-lane local divided highway will often take you through towns where the speed limit drops to 35. If the driver doesn't pay attention, the car would speed right through.
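To make the point concrete, here's a minimal sketch of what I mean (purely illustrative; the ceiling value and function name are made up, not any manufacturer's actual logic): the driver's requested setpoint gets clamped to a system maximum, but nothing in it reacts to the posted limit changing along the route.

```python
# Illustrative only: hypothetical ceiling, not a real spec value.
ACC_MAX_SETPOINT_MPH = 90

def accepted_setpoint(requested_mph: float) -> float:
    """Clamp the driver's requested cruise speed to the system ceiling."""
    return min(requested_mph, ACC_MAX_SETPOINT_MPH)

# Driver sets 74 mph on a 65 mph road: the system holds 74.
print(accepted_setpoint(74))  # -> 74
# Entering a 35 mph town zone changes nothing unless the driver intervenes.
print(accepted_setpoint(74))  # -> still 74
```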
 
  • Like
Likes mheslep
  • #99
There's something fundamentally wrong with enabling idiocy at the wheel.

Too much automation. That's how you get pilots who cannot land an airliner when autopilot gets confused.

A lane hold makes sense, as does a wing leveler, but I'd put a time limit on it and lock out simultaneous speed hold.
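Something along these lines (a toy sketch; the names and the 30-second limit are invented for illustration, not taken from any real system): lane hold expires on a timer, and the two holds can't be engaged at the same time.

```python
LANE_HOLD_LIMIT_S = 30.0  # invented time limit, for illustration only

class AssistLockout:
    """Toy model: lane hold and speed hold are mutually exclusive,
    and lane hold drops out after a fixed time limit."""

    def __init__(self):
        self.lane_hold_since = None  # engagement timestamp, or None
        self.speed_hold = False

    def engage_lane_hold(self, now_s: float) -> bool:
        if self.speed_hold:          # lock out simultaneous use
            return False
        self.lane_hold_since = now_s
        return True

    def engage_speed_hold(self) -> bool:
        if self.lane_hold_since is not None:
            return False
        self.speed_hold = True
        return True

    def tick(self, now_s: float) -> None:
        # Expire lane hold, forcing the driver to re-engage it by hand.
        if (self.lane_hold_since is not None
                and now_s - self.lane_hold_since > LANE_HOLD_LIMIT_S):
            self.lane_hold_since = None
```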

Try autopilot on this road at 74 mph.
[Attached image of the road]
 
  • #100
russ_watters said:
I've read several adaptive cruise control specs and recall that there is generally an upper limit to the setpoint.

I think what mfb is driving at, though, is that this is another reason why the driver has to pay attention. And I agree, particularly for the type of road in this case: a 4-lane local divided highway will often take you through towns where the speed limit drops to 35. If the driver doesn't pay attention, the car would speed right through.
Apparently so for this version of the Tesla, though I don't know the details. Autonomous vehicles theoretically can use GPS and road data to know where speed changes occur (which would always have some disagreement with reality), and then the ability also exists for the vision system to "read" posted speed signs.
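As a rough sketch of how those two sources might be combined (hypothetical function and names, not Tesla's or Mobileye's actual logic): take the lower of the map limit and a camera-read sign when they disagree, and fall back to the map alone when the sign can't be read.

```python
from typing import Optional

def effective_speed_limit(map_limit_mph: Optional[float],
                          sign_limit_mph: Optional[float]) -> Optional[float]:
    """Combine a GPS/map speed limit with a camera-read sign, conservatively."""
    candidates = [v for v in (map_limit_mph, sign_limit_mph) if v is not None]
    if not candidates:
        return None  # no information at all; the driver has to decide
    # Stale map data or a misread sign: take the lower value as the safer bet.
    return min(candidates)

print(effective_speed_limit(65, 35))    # -> 35 (sign says the limit dropped)
print(effective_speed_limit(65, None))  # -> 65 (sign obscured, map fallback)
```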

Video from Tesla's autonomous software vendor (until yesterday):



This too can have real-world limitations, of a kind that wouldn't catastrophically trip up a human driver:

[Image: speed limit sign obscured by a hedge]
 
  • #101
jim hardy said:
Too much automation.
I think misused, not too much. No need to start a Luddite attack on ATMs; they work fine at all hours and don't run over people.
 
  • Like
Likes nsaspook, OCR and jim hardy
  • #102
jim hardy said:
Try autopilot on this road at 74mph.
The car slows down automatically if curves (or other traffic) make it necessary.

Road signs that are barely visible are a problem for humans as well - here online databases can be even better than human drivers.
 
  • #103
Humans, though, excel at making identifications in the presence of insufficient or hidden information, relative to machines. They can often identify a barely visible road sign as a barely visible road sign, distinguish it from a plastic bag in the bushes, and act accordingly.
 
  • #104
On a positive note about the Tesla Autopilot - Man says Tesla Autopilot drove him to the hospital, saved his life
https://www.yahoo.com/finance/news/man-says-tesla-autopilot-drove-191549779.html

The system has come under fire after it was involved in a fatal Florida crash in May, but Neally told online magazine Slate that Autopilot drove him 20 miles down a freeway to a hospital, while Neally suffered a potentially fatal blood vessel blockage in his lung, known as a pulmonary embolism. The hospital was right off the freeway exit, and Neally was able to steer the car the last few meters and check himself into the emergency room, the report said.

Tesla's Autopilot technology has been cited in both the May crash and a second, non-fatal crash in Montana in June. Both the National Highway Traffic Safety Administration and the National Transportation Safety Board have investigated the Florida crash, and the Securities and Exchange Commission reportedly looked into whether Tesla broke securities law by failing to disclose information about the May accident before an equity offering.

A group of researchers from the University of South Carolina, China's Zhejiang University and the Chinese security firm Qihoo 360 apparently figured out how to hack into the Autopilot system and jam the radar to prevent it from seeing an object in front of it.
 
  • #105
Nissan ProPILOT self-driving chair

 
  • Like
Likes mfb
  • #106
"Tesla requires the driver to do the monitoring"
-- A self-driving car is useless if you have to keep your hands on the wheel and constantly keep watch. In that case, I'd rather do the driving myself.
 
  • #107
eltodesukane said:
A self-driving car is useless if you have to keep your hands on the wheel and constantly keep watch. In that case, I'd rather do the driving myself.
I would prefer such a car over a non-self-driving car. In addition, it is just an intermediate step, one that helps reach the final goal of reliable, fully self-driving cars.
 
  • #108
eltodesukane said:
A self-driving car is useless if you have to keep your hands on the wheel and constantly keep watch. In that case, I'd rather do the driving myself.
I drive an automatic Civic, so driving is not all that much fun. I'd still prefer a self-driving car where you still needed to be attentive. It would improve my gas mileage and, I'm sure, be safer. That is what I care about.
 
  • #109
A couple of researchers in the field commented on the arrival, or delay, of autonomous vehicles in this WSJ article.

Recent car manufacturer claims first:

...Ford Motor Co., BMW AG, Volvo Car Corp. and Lyft Inc. say they will produce fully autonomous vehicles by 2021 or sooner. Tesla Motors Inc. Chief Executive Elon Musk, rarely topped in hyperbole, says the technology will be here within 24 months...

CMU engineering professor
“These statements are aspirations, they’re not really reality,” says Raj Rajkumar, a professor of engineering at Carnegie Mellon University, who collaborates with General Motors Co. “The technology just isn’t there.…There’s still a long way to go before we can take the driver away from the driver’s seat.”

Duke engineering professor
Mary Cummings, a professor of mechanical, electrical and computer engineering at Duke University, says a fully autonomous car “operates by itself under all conditions, period.” She adds, “We’re a good 15 to 20 years out from that.”

Leader of the Google self-driving car project
Chris Urmson knows the field as well as anyone, having led the self-driving car project at Google parent Alphabet Inc. for more than seven years before departing in August. Last March, he told the SXSW conference that self-driving technology will arrive for some of us in a few years, and for the rest of us in 30. That is, it could arrive soon for very specific uses; but as a full-bore replacement for humans, it will take a long time.

Manager of Transportation Tech program at Berkeley
“I always remind people we’ve had driverless vehicles carrying people between terminals at an airport for 40 years,” says Steven Shladover, manager of the Partners for Advanced Transportation Technology program at the University of California, Berkeley. “But they’re operating in a very well protected right of way.”

And the exception: the CTO of Mobileye, the company behind Tesla's autonomous tech
Not everyone agrees, of course. Amnon Shashua, co-founder and chief technology officer of Mobileye, says that the problem of sensing and controlling in self-driving cars is mostly solved. Perfecting these systems won’t require scientific breakthroughs, he says—just many small improvements in the software, gleaned from watching humans drive in the real world.

“The ingredients exist; now it is a matter of engineering,” Mr. Shashua says.
 
  • Like
Likes Greg Bernhardt
  • #110
mheslep said:
Recent car manufacturer claims first:
What will speed up development is customer demand.
 
  • #111
Greg Bernhardt said:
What will speed up development is customer demand.
Well, I think that depends on which of the opinions above is right. If truly autonomous, no-driver vehicles have remaining fundamental problems on the order of, say, controlled nuclear fusion, and are decades away as some of the experts above indicate, then demand won't magically reduce the wait. If the Mobileye CTO is right, then yes, throwing money and engineers at the problem using existing approaches should speed up deployment.
 
  • #112
Fusion research could be faster with more funding as well. Autonomous cars get the funding.
 
  • #113
mfb said:
Fusion research could be faster with more funding as well. Autonomous cars get the funding.
Anything can be faster with more funding, but the point is that when the development timeline is unknown because of technological hurdles, the compression is unknown too.
 
  • Like
Likes mheslep
  • #114
Well, if you know what you want to do, but need 10 years to get funding for it, it delays progress - by up to 10 years if it is the critical path. You don't know the exact critical path in advance, but you know that you delay progress if all projects need longer to get funding.
 
  • #115
mfb said:
Well, if you know what you want to do, but need 10 years to get funding for it, it delays progress - by up to 10 years if it is the critical path. You don't know the exact critical path in advance, but you know that you delay progress if all projects need longer to get funding.
I'm not really sure I see how that's relevant. I think the line of discussion we were on would go something like this:

If the expected development timeline is 10 years at the current funding, doubling funding could reduce it to 5: 10/2=5.

But unknown/2=unknown and never/2=never...and of course maybe unknown=never

For driverless cars, I suspect the technical problems are solvable, but it is not easy to predict how much effort is needed. For fusion, I'm still leaning toward unknown=never.
 
  • #117
Greg Bernhardt said:
That's interesting. The basic argument is that in order to control plasma you first have to understand it and in order to understand it you need better computer models. So we will soon be getting better computer models, so we'll be able to understand and control it.
The problem is that the logic is not reversible: understanding plasma does not guarantee that we will be able to control it. So I'm not as optimistic.
 
  • Like
Likes nsaspook
  • #118
russ_watters said:
So I'm not as optimistic.
The ultimate is a driver-less fusion car :)
 
  • #119
russ_watters said:
But unknown/2=unknown and never/2=never...and of course maybe unknown=never
I would define 2 as the "compression", so compression can be known even if the overall scale is unknown.

Anyway: There are cars driving around on streets without continuous human input already, and I'm sure their number will increase a lot, and at some point they won't need a backup driver any more.

Concerning fusion: Decades ago, when the first timelines were made, they also came with total investment estimates. We still haven't matched those investment estimates. ITER and DEMO are a clear road - if DEMO gets funded, it should allow reliable estimates on costs of commercial power plants.
 
  • #120
mfb said:
... ITER and DEMO are a clear road - if DEMO gets funded, it should allow reliable estimates on costs of commercial power plants.

Clear road? It's not yet established that ITER can hold its plasma for the targeted 5 minutes, that sufficient tritium can be bred, or that the first wall won't suffer too much damage. Then, funding DEMO means the cost of a DEMO-like plant becomes known. But there's no indication that the cost of a commercial DEMO-like plant would be at all feasible, where feasible means competitive with the cost of a fission plant.
 
