
News: First fatal accident involving a car in self-driving mode

  1. Jul 1, 2016 #1

    Borg

    Gold Member

    It had to happen eventually. The world's first fatality while a car was driving itself.

    Self-Driving Tesla Was Involved in Fatal Crash

    If the car can't see a semi turning in front of it, you have to wonder if it can see a motorcycle or even other cars.
     
  3. Jul 1, 2016 #2

    mfb

    2016 Award

    Staff: Mentor

    And it was clearly a problem of the software: it was programmed to ignore objects that have components only "in the air" without a ground connection, e.g. overhead road signs (source: Musk).

    On the other hand, consider the numbers from Tesla: 130 million miles driven on Autopilot, against a US average of one fatal accident every 94 million miles. One fatal accident has happened, which is still a bit below average, although we would need to split that up by road category to get a better estimate.
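    A rough back-of-the-envelope check of those figures (a sketch only, assuming fatal accidents follow a Poisson process; the 130-million-mile and 1-in-94-million numbers are the ones quoted above):

    ```python
    import math

    # Figures quoted above: 130 million Autopilot miles,
    # US average of one fatal accident per 94 million miles.
    autopilot_miles = 130e6
    us_rate = 1 / 94e6  # fatal accidents per mile

    # Expected number of fatal accidents if Autopilot matched the US average
    expected = autopilot_miles * us_rate  # ~1.38

    # Probability of seeing at most 1 fatal accident under a Poisson model
    p_at_most_one = math.exp(-expected) * (1 + expected)

    print(f"expected fatalities: {expected:.2f}")
    print(f"P(<= 1 fatality):    {p_at_most_one:.2f}")
    ```

    With an expectation of roughly 1.4 fatal accidents over 130 million miles, observing exactly one is entirely consistent with the average US rate (probability of one or fewer is about 0.6), so this single data point says little either way.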
     
  4. Jul 1, 2016 #3

    Borg

    Gold Member

    I was thinking that might be the case, where it was essentially looking under the truck's trailer. I guess it saw the cab properly but, after the cab cleared its path, decided that the trailer was a billboard.
     
  5. Jul 1, 2016 #4

    mheslep

    Gold Member

    Very likely the accident rate with autopilot will be lower than under a human driver. The autopilot will never tire, get distracted, become emotionally impaired, or drive after a few drinks.

    On the other hand, accidents like this one will occur that even the worst human driver would avoid. The vehicle will not show the least bit of extra caution with a carload of children going through a school zone, nor even attempt to slow down as it drives off a cliff at high speed because a sensor failed once in 100 million miles.

    The thing is, I don't think we're programmed to accept those kinds of trade-offs.
     
    Last edited: Jul 1, 2016
  6. Jul 1, 2016 #5

    mfb

    2016 Award

    Staff: Mentor

    Airplanes do more and more via autopilot and fly-by-wire, there are automatic metro trains, and so on. And at least initially, we can have the combination of an autopilot plus a human driver monitoring what the car does.
     
  7. Jul 1, 2016 #6

    Ivan Seeking

    Staff Emeritus
    Science Advisor
    Gold Member

    Computers and software will get better at driving. But humans, if anything, are getting worse. The other day I was thinking about this, and I can easily see a time when the idea of driving a car would seem threatening to many people. The same goes for piloting aircraft. I suspect we will adapt very quickly. I would wager that in less than a generation, the notion of driving cars and flying planes will seem as archaic as riding a horse, and will perhaps serve only as cheap thrills for the daring.

    Also, there is no reason why computers can't be taught to recognize potential dangers like school zones. Heck, every school could have a beacon that warns all approaching traffic that children are present. But the car would already know that because of Google Maps. :D

    Bottom line: In its relative infancy, self driving technology is already besting humans or doing about as well.
     
  8. Jul 1, 2016 #7

    russ_watters


    Staff: Mentor

    I agree and I'm utterly shocked that autonomous cars have been allowed with very little government oversight so far. We don't know how they are programmed to respond to certain dangers or what Sophie's choices they would make. We don't know what they can and can't (a truck!) see.

    Along a similar vein, I don't know if people remember, but there was quite a crisis in the late 1990s when airbags saved vast numbers of people but killed a handful of children:
    http://www.nytimes.com/2001/08/30/us/child-air-bag-deaths-drop-parents-get-bulk-of-credit.html

    In our litigious society, people won't be very willing to accept this.
     
  9. Jul 1, 2016 #8

    russ_watters


    Staff: Mentor

    Both of those are much, much simpler sensing and control problems than cars are.
     
  10. Jul 1, 2016 #9

    mheslep

    Gold Member

    With regard to avoidance and consequence, the case of aircraft autopilot is closer to traditional car cruise control than to the autonomous vehicle software now on the road. Passenger aircraft do not maneuver autonomously on the crowded tarmac.

    Autonomous vehicles are on the road today via this Tesla model, and as this accident shows, they are operating with no "human driver monitoring" required in practice. I think the major auto companies have chosen a safer path, calling their autonomous software something like "super cruise control" and restricting it to defensive actions like avoiding impending accidents, or to simple parking, while the driver is always in charge.
     
  11. Jul 2, 2016 #10

    mfb

    2016 Award

    Staff: Mentor

    Airbags got adopted.
    Sure, and cars are the next step.
    Tesla requires the driver to do the monitoring. In reality many drivers don't, but can you blame Tesla for this?
     
  12. Jul 2, 2016 #11

    Dotini

    Gold Member

    But in this case the Tesla didn't have a driver, it had a passenger supposedly watching a movie. Was the movie playing on the Tesla's screen? You live by technology but you also die by technology, especially if you become a lab rat along the way.
     
  13. Jul 2, 2016 #12

    jim hardy

    Science Advisor
    Gold Member
    2016 Award

    see my signature.

    What was the driver doing? It is reported he was watching Harry Potter.
    http://sanfrancisco.cbslocal.com/2016/07/01/driver-in-fatal-tesla-crash-had-history-of-speeding/
    Florida made a law back when TVs became transistorized requiring that a TV screen in an automobile must not be located within the driver's sight. What a sensible idea! Even as a teenager I could see that.

    Fifty years later my kids bought an SUV with a touchscreen smack dab in the middle of the dashboard, with menu-driven controls for the air conditioner, radio, and other bells & whistles unimaginable to a luddite like me. There was even a telephone!
    What a preposterous state of affairs. Airlines used to have a Flight Engineer to tend to the machinery so the pilot could fly the plane.


    And I'm flabbergasted by the so-called "journalists" trying to blame everybody except the driver:
    http://www.chron.com/business/technology/article/The-Latest-Tesla-crash-a-harbinger-of-industry-8335889.php [Broken]
    http://gizmodo.com/fatal-tesla-crash-proves-full-autonomy-is-the-only-solu-1782923424


    If you allow and even encourage drivers to be distracted by silly electronic doodads you are going to get distracted drivers and crashes.
    My answer to this excess automation is to require that any vehicle having an electronic display screen or keyboard within the driver's view be manned by two persons: one to drive and the other to tend the electronics. Maybe a third person to make sure one of the other two is driving.

    Techno-narcissists who think up this silly stuff need a moment of introspection.

    old jim
     
    Last edited by a moderator: May 8, 2017
  14. Jul 2, 2016 #13

    mfb

    2016 Award

    Staff: Mentor

    I don't know where the movie was playing, but playing a movie on a screen can also be for others in the car.
    If you play a movie in a regular car and cause a crash, it is clearly your fault. Why? Because you are supposed to pay attention to the road. See the similarity?
     
  15. Jul 2, 2016 #14

    russ_watters


    Staff: Mentor

    I'm not saying self-driving cars won't happen, or even that they shouldn't (I'm a big fan of all tech). I'm just saying they've been released too soon, with too little oversight, and they have a long and rocky road ahead of them.
    Yes. In the US, you can blame (sue) anyone for anything. People will.
     
  16. Jul 2, 2016 #15

    mfb

    2016 Award

    Staff: Mentor

    What improves if they are implemented later?
    - while we don't have enough statistics, accident rates don't seem to go up from it
    - the software quality certainly profits from larger datasets
    There is not just the US, and we'll see how successful those lawsuits are when the driver ignored several warnings and didn't follow the instructions at all.
     
  17. Jul 2, 2016 #16

    russ_watters


    Staff: Mentor

    Bug fixes (like recognizing a truck!)
    Regulations.
    Simulations.
    Training.

    You used the analogy of plane autopilots and fly-by-wire. Despite being an easier problem, it is a good analogy: pilots get trained how to use them and what their limitations are (though they sometimes fail to interact with them properly) and they are tested for a decade before being "released". And federal regulators are on top of them the entire time. What bothers me is that Tesla/Google are literally beta testing these features on the public like they would a new version of Chrome ( https://www.teslamotors.com/presskit/autopilot ). But when Chrome crashes, no one gets hurt. Would people really be ok if Boeing beta tested its planes on the public?
    I'm not suggesting they do. In fact, I agree they are almost certain to go down. One link I read described the legal problem this way: today, 95% of car crashes are the driver's fault and 5% are the auto companies' fault. Auto-drive will reduce the 95% that are the drivers' fault and increase the 5% that are the auto companies' fault. The net result for the auto companies is likely to be an increase in liability due to the increase in their responsibility for crashes. That's going to be a difficult problem, and I doubt people and companies have thought it through or are ready for it.
    Certainly. Some of that can be done with the sensors alone, and no autodrive (and I'm sure Tesla did that). And at least Google did a lot of its testing with unmanned cars or with cars that had a paid "test pilot" behind the wheel. But while it would certainly help improve the software faster to beta test an airliner in real-world conditions, carrying passengers, in real airports, it would be a really bad idea to do it.
    Yes, I think the best bet would be to beta test these cars in countries where people have less right to sue if the car malfunctions. :D

    And while I'm generally in favor of reducing this country's litigiousness, this is a case where I think there is clear negligence by the car companies. The very fact that they call it a "beta" test tells us they know the product is not ready for public use.
    People went to jail for not shutting off airbags that killed their kids. But car companies also got sued and lost lots and lots of money. I don't think the argument that you need to pay attention to your self driving car because it is flawed and creates new dangers will fly very well in court.
     
  18. Jul 2, 2016 #17
    I won't be an early adopter, but I am fully committed to the idea of fully automated cars. It will be the next big revolution once honed.
     
  19. Jul 2, 2016 #18

    russ_watters


    Staff: Mentor

    I'm going to go a step further: I predict that when legislators and lawyers get a chance to process this accident, the party will be over for Tesla: they will be forced to remove the auto-drive feature until such time as they (the government, the insurance institute, Tesla) can get it under control.

    As for current owners of Teslas, hopefully the party is over for them too. No more playing Jenga and watching movies while your car drives itself. People need to take more seriously a beta test that can kill you if it goes wrong and I think now they will.

    Even worse, I wonder how Tesla software engineers feel right now. You know, the ones who have been going through the data and collecting the bugs to fix. I wonder if they knew about this bug. What do you call it when someone lets a known design flaw kill a person?
     
  20. Jul 2, 2016 #19

    jim hardy

    Science Advisor
    Gold Member
    2016 Award

    One of life's lessons is you weigh risks against consequences.
    I regard the whole computer industry as "flakey" for the reason you describe: lack of consequences foments irresponsibility.

    I fixed computers long enough to be very wary of them.
    I don't like the idea of them running nuke plants or automobiles without supervision.
    Indeed, federal law used to forbid placing any nuclear reactor under the control of "a programmable device".


    They're already working on computer collision avoidance. Let's take that one step further:

    With the eye scan technology we have, that autopilot should have been aware of where the driver was looking. When his eyes strayed from the road it should have pulled over, parked the car
    and announced in that stern feminine GPS voice :
    "Your driving privilege is hereby revoked on grounds of inattention. When I sense a new buttprint in the driver's seat, I will unlock the controls."

    If you're going to automate, automate. :wink:

    Or, we could keep life simple.

    old jim
     
    Last edited: Jul 2, 2016
  21. Jul 2, 2016 #20

    mfb

    2016 Award

    Staff: Mentor

    I think you misunderstood my question. For any given point in time, do you expect anything to be better if we delay implementation of the technology? Would Tesla's software at the end of 2016 be better if they didn't have drivers using it? Would regulations at the end of the year be better? Simulations? Anything else? Same question for 2017, 2018, ...

    I'm not comparing "self-driving features first introduced 2026" with "self-driving features first introduced 2016". Of course the first would have better software. But then you have to ignore the tens of thousands that die from manual driving in those 10 years in the US alone.

    I think the current accident will do a lot to improve the software, the regulations, and so on, reducing the number of accidents in the future.
     