Workplace Situational Awareness: Stopping is always an option
April 13, 2010 by Jeff "Odie" Espenship

As a USAF fighter pilot and commercial international airline pilot, I relate directly to the human behaviors of employees who work in high-risk and often dangerous work environments. Workers in manufacturing, engineering, utilities, chemical plants, construction, or hospitals are no different from aircraft pilots. These workers are highly trained professionals, striving to safely complete difficult and often dangerous tasks in a dynamic, changing environment. They handle sophisticated equipment while often fighting fatigue, interruptions, and distractions. In this atmosphere, attention to detail is vital, complacency is a silent killer, and shortcuts taken in the interest of saving time and staying on schedule become common. Deviant workplace behavior becomes normalized over time because "we've always gotten away with it." Then one day, our deviant behavior catches up with us and disaster is the outcome. Thus, when job-related incidents and accidents happen, we must frequently reflect inward. We can look no further than human behaviors as the root cause.

Company leadership in any corporation can glean valuable human-behavior lessons from aviation accidents. Why? Because no matter what job you do, we are all human, and humans make mistakes. When accidents occur in aviation, however, there is a black box that records actual "on the job" conversations, actual worker (pilot) inputs, operator manipulation of vehicle controls and the aircraft's response, and possible equipment failures – all painstakingly scrutinized. It is not my place, nor my purpose here, to judge anyone. As safety professionals, it is our job to learn from those who have gone before us and to pass along valuable lessons so that mishaps are not suffered in vain. I would like to draw attention to the recent aviation accident of Comair flight 5191, which departed from the wrong runway in Lexington, KY on August 27, 2006. The mishap cost 49 people their lives.
After a thorough investigation by the NTSB, board member Deborah Hersman suggested that there were numerous causes, nearly all of them human. She summed it up by saying, "...this accident has led us into the briar patch of human behavior."

Let's step into the cockpit of Comair flight 5191. It's early morning and still dark out as the captain taxis the airplane to depart on runway 22. Instead, he mistakenly turns onto runway 26, a runway too short for the required takeoff roll. There are many lessons to learn from this accident; I would like to focus on human behaviors in the workplace as they relate to situational awareness.

Pilots refer to their "Situational Awareness" (also referred to as SA) as a measuring stick for their perception of actual reality. Problems arise when their actual situation is revealed to be different from their perception of it. In The Field Guide to Understanding Human Error, Sidney Dekker says we combine our experience, education, and intellect to create a mental picture of what is going on around us. Because of this, one's situational awareness is highly individualized. So naturally, those with the most intellect, experience, and training usually get promoted into supervisory positions (aircraft captains). When doing our job, either alone or in a group, we rely on our SA to make work decisions, lead others, figure out how to proceed, and project future outcomes of work actions. However, our SA is only as accurate as our perception of reality, and there may be errors in our internal processing of information, or external factors that distort our perceptions. The captain of the ill-fated Comair flight turned onto the wrong runway for takeoff. How could this happen with a highly experienced and trained captain at the tiller and a normally alert copilot sitting next to him?
Aircraft manufacturers install the nosewheel "tiller" only on the captain's side, so he is the only one who can physically steer (taxi) the airplane when it's on the ground. However, it is the responsibility of EVERYONE to verify where the airplane is going. Aviation, like many other businesses, relies on redundancies and backups for safety. Copilots (workers) cannot afford to be complacent or "assume" the captain (supervisor) is always right during critical operations. Taxiing an airplane is considered critical, especially at busy airports. Thus, the FAA mandates a "sterile cockpit," a rule designed to eliminate unnecessary chatter during ground operations. It is up to everyone to adhere to this rule, and it is ultimately the captain's responsibility to enforce it when unnecessary chatter in the cockpit begins. In the Comair accident, NTSB staff member Joe Sedor identified nonpertinent "chatter between [Captain] Jeffrey Clay and [Copilot] James Polehinke as they prepared to taxi and take off. Comair has acknowledged some culpability as a result of the talk." Sedor said the chatter "greatly affected the crew's performance."

In any corporation where safety is paramount, when a supervisor or manager fails to enforce even the smallest of rules, it creates an on-the-job culture that falls below established company policy. Workers will operate within the norms of the job culture created by the job-site leader, and that culture easily becomes "the way it's done around here." Consequently, if a stricter supervisor happens to show up, the job culture immediately changes, and the workers will perform within the standards of the new leader. Job culture created by leaders is probably the most powerful of all influences on human workplace behavior – more than attitude, training, skill, or experience.
When the on-the-job culture runs incongruent with the work rules established by the company, the on-the-job culture will have the greater influence on work behaviors. By allowing the non-pertinent chatter in the cockpit, both the captain's and the copilot's situational awareness degraded, and they didn't know it. Their perception of reality had them believing they were lining up on runway 22; in actual reality, they were lining up on the wrong, short, unlit runway 26.

Once we believe our perception of reality to be true, we tend to rationalize away, or even ignore, any clues that surface later to indicate that our current situation is different from what we perceive it to be. Worse, as our workload increases or pressure mounts to complete tasks, this selective inattention to a potential problem becomes more pronounced. Our brain goes into a "load shedding" mode as we subconsciously try to "fill in the blanks." We rationalize, even ignore, each problem or clue in order to keep our perceptions congruent with our mistaken reality, though evidence and signs mount all around to suggest otherwise. Workers and supervisors need training to understand this phenomenon. Pay attention to those clues; they are often the last hint we have before tragedy strikes.

As Comair flight 5191 began its takeoff roll down an unlit, darkened runway in the early morning, the first hint that "something wasn't right" came approximately 12 seconds into the roll. The first officer commented, "...is weird with no lights." "Yeah," confirmed the captain. There was no further response until, over 15 seconds later, the captain made an exclamatory "Whoa!" as the end of the runway appeared. The sounds of the crash followed shortly thereafter. For those 15 seconds, with the workload of the takeoff roll running high, the crew of the ill-fated flight discounted the copilot's rhetorical observation that the runway lights being off was "weird."
Did they rationalize that since part of the airport was under construction, the runway lights were off? Did they ignore the requirement for runway lights? They both agreed that no runway lights at night was "weird," yet they continued to work instead of stopping. Had they immediately stopped the work (discontinued the takeoff) in order to bring their situational awareness in line with actual reality, 49 people would probably be alive today.

The lessons: First, don't break the "little rules." It leads to breaking more rules, more often. Second, develop a healthy "questioning attitude" among workers. Third, a questioning attitude is only as good as the follow-through: stop the work, crosscheck, verify, and thoroughly evaluate the problematic question. Never assume or rationalize. Only when everyone's situational awareness is congruent with actual reality are we safe.