I recently took an interesting course on Human Error, and the article below really was an eye-opener. It's a bit of a read, but worth it to give the "finger pointers" among us something deeper to consider. It's a document from a human factors specialist, in three parts:
Thoughts on the New View of Human Error Part I: Do Bad Apples Exist?
by Heather Parker, Human Factors Specialist, System Safety, Civil Aviation, Transport Canada
The following article is the first of a three-part series describing some aspects of the “new view” of human error (Dekker, 2002). This “new view” was introduced to you in the previous issue of the Aviation Safety Letter (ASL) with an interview by Sidney Dekker. The three-part series will address the following topics:
Thoughts on the New View of Human Error Part I: Do Bad Apples Exist?
Thoughts on the New View of Human Error Part II: Hindsight Bias
Thoughts on the New View of Human Error Part III: “New View” Accounts of Human Error
http://www.tc.gc.ca/CivilAviation/publications/tp185/4-06/Pre-flight.htm#HumanError
Before debating if bad apples exist, it is important to understand what is meant by the term “bad apple.” Dekker (2002) explains the bad apple theory as follows: “complex systems would be fine, were it not for the erratic behaviour of some unreliable people (bad apples) in it; human errors cause accidents, and humans are the dominant contributor to more than two-thirds of them; failures come as unpleasant surprises, unexpected and not belonging in the system; failures are introduced to the system only through the inherent unreliability of people.”
The application of the bad apple theory, as described above by Dekker (2002), makes great, profitable news, and it is also very simple to understand. If operational errors are attributable to poor or lazy operational performance, then the remedy is straightforward: identify the individuals, take away their licences, and put the evil-doers behind bars. The problem with this view is that most operators (pilots, mechanics, air traffic controllers, etc.) are highly competent and do their jobs well. Punishment for wrongdoing is not a deterrent when the actions of the operators involved were actually examples of “right-doing”: the operators were acting in the best interests of those charged to their care, but made an “honest mistake” in the process. This is the case in many operational accidents.
Can perfect pilots and perfect AMEs function in an imperfect system?
This is a more complex view of how humans are involved in accidents. If the operational errors are attributable to highly competent operational performance, how do we explain the outcome, and how do we remedy the situation? This is the crux of the complex problem: the operational error is not necessarily attributable to the operational performance of the human component of the system; rather, it is attributable to, or emerges from, the performance of the system as a whole.
The consequences of an accident in safety-critical systems can be death and/or injury to the participants (passengers, etc.). Given the responsibility operators hold, society demands that they be superhuman and infallible. Society compensates and acculturates operators in a way that demands they perform without error. This is an impossibility: humans, whether doctors, lawyers, pilots, or mechanics, are fallible. It should be the safety-critical industry’s goal to learn from mistakes rather than to punish them, because the only way to prevent mistakes from recurring is to learn from them and improve the system. Punishing mistakes only serves to strengthen the old view of human error, preventing true understanding of the complexity of the system and blocking possible routes for building resilience to future mistakes.
To learn from the mistakes of others, accident and incident investigations should seek to understand how people’s assessments and actions would have made sense at the time, given the circumstances that surrounded them (Dekker, 2002). Only once it is understood why their actions made sense can the human–technology–environment relationships be explained and possible means of preventing recurrence be developed. This approach requires the belief that safety is better served when learning, rather than punishment, is the ultimate result of an investigation.
In the majority of accidents, good people were doing their best to do a good job within an imperfect system. Pilots, mechanics, air traffic controllers, doctors, engineers, etc., must meet rigorous entry and proficiency requirements. Additionally, they receive extensive training and have extensive systems to support their work. Furthermore, most of these people are directly affected by their own actions; for example, a pilot is on board the aircraft they are flying. This infrastructure restricts these jobs to competent and cognisant individuals. Labelling these individuals as bad apples and reprimanding them when honest mistakes are made will only make the system more hazardous. By approaching these situations with the goal of learning from the experience of others, system improvements are possible. Superficially, this way ahead may seem like what the aviation industry has been doing for the past twenty years. However, more often than not, we have only used different bad apple labels, such as complacent, inattentive, distracted, and unaware, to name a few; labels that only serve to punish the human component of the system. If human performance is to be explained in context and the underlying factors that need reform are to be identified, investigations into incidents and accidents must seek to understand why the operator’s actions made sense at the time, given the situation. This is much harder to do than anticipated.
In Part II, the “hindsight bias” will be addressed: a bias that often affects investigators. Simply put, hindsight means being able to look back, from the outside, on a sequence of events that led to an outcome, and letting the outcome bias one’s view of the events, actions and conditions experienced by the humans involved (Dekker, 2002). In Part III, we will explore how to write accounts of human performance following the “new view” of human error.
Thoughts on the New View of Human Error Part II: Hindsight Bias
Have you ever pushed on a door that needed to be pulled, or pulled on a door that needed to be pushed, despite signage indicating what action was required? Now consider this same situation during a fire, with smoke hampering your sight and breathing. Why did you not know which way to move the door? There was a sign; you’ve been through the door before. Why would you not be able to move the door? Imagine that, because of the problem moving the door, you inhaled too much smoke and were hospitalized for a few days. During your stay in the hospital, an accident investigator visits you. During the interview, the investigator concludes that you must have been distracted, since you did not pay attention to the signage on the door, and that, given your experience with the door, he cannot understand why you did not move it the right way. Finally, he concludes there is nothing wrong with the door; rather, it was your inexplicable, poor behaviour that was wrong. It was your fault.
The investigator in this example suffered from the hindsight bias. With a full view of your actions and the events, he can see, after the fact, what information you should have paid attention to and what experience you should have drawn from. He is looking at the scenario from outside the situation, with full knowledge of the outcome. Hindsight means being able to look back, from the outside, on a sequence of events that led to an outcome you already know about; it gives you almost unlimited access to the true nature of the situation that surrounded people at the time; it also allows you to pinpoint what people missed and shouldn’t have missed, and what they didn’t do but should have done (Dekker, 2002).
Thinking more about the case above, put yourself inside the situation and try to understand why you had difficulty exiting. In this particular case, the door needed to be pulled to exit because it was an internal hallway door. Despite a sign indicating the need to pull the door open (likely put there after the door was installed), the door’s handle, a horizontal bar across the middle of the door, was designed for pushing. Additionally, in a normal situation, the doors are kept open by doorstops to facilitate the flow of people, so you rarely have to move the door in your normal routine. In this case, it was an emergency, smoke reduced your visibility, and it is likely you were somewhat agitated. When looking at the sequence of actions and events from inside the situation, we can explain why you had difficulty exiting safely: a) the design of the door, b) the practice of keeping the fire doors open with doorstops, c) the reduced visibility, and d) the real emergency are all contributing and underlying factors that help us understand why difficulty was encountered.
According to Dekker (2002), hindsight can bias an investigation towards what the investigator now knows (given the outcome) was important, and as a result, the investigator may assess people’s decisions and actions mainly in light of their failure to pick up the information critical to preventing the outcome. When affected by hindsight bias, an investigator looks at a sequence of events from outside the situation, with full knowledge of the events and actions and their relationship to the outcome (Dekker, 2002).
The first step in mitigating the hindsight bias is to work towards the goal of learning from the experience of others to prevent recurrence. When the goal is to learn from an investigation, understanding and explanation are sought. Dekker (2002) recommends taking the perspective from “inside the tunnel”: the point of view of the people in the unfolding situation. Investigators must guard against mixing their own reality with the reality of the people being investigated (Dekker, 2002). One investigator in a high-profile accident investigation wrote: “…I have attempted at all times to remind myself of the dangers of using the powerful beam of hindsight to illuminate the situations revealed in the evidence. Hindsight also possesses a lens which can distort and can therefore present a misleading picture: it has to be avoided if fairness and accuracy of judgment is to be sought.” (Hidden, 1989)
Additionally, when writing the investigation report, any conclusions that could be interpreted as coming from hindsight must be supported by analysis and data; a reader must be able to trace through the report how the investigator came to the conclusions. In another high-profile accident, the investigator emphatically asked: “Given all of the training, experience, safeguards, redundant sophisticated electronic and technical equipment and the relatively benign conditions at the time, how in the world could such an accident happen?” (Snook, 2000). To mitigate the tendency to view the events with hindsight, this investigator ensured that all accounts in his report clearly stated the goal of the analyses: to understand why people made the assessments or decisions they made, and why these assessments or decisions would have made sense from the point of view of the people inside the situation. Learning and subsequent prevention or mitigation activities are the ultimate goals of accident investigation; having agreement from all stakeholders on this goal will go a long way towards mitigating the hindsight bias.
Dekker, S., The Field Guide to Human Error Investigations, Ashgate, England, 2002.
Dekker, S., The Field Guide to Understanding Human Error, Ashgate, England, 2006.
Hidden, A., Investigation into the Clapham Junction Railway Accident, Her Majesty’s Stationery Office, London, England, 1989.
Snook, S. A., Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq, Princeton University Press, New Jersey, 2000.