We’ve spent the last year on this site talking about how we make mistakes and what we can all do to avoid them. We’ve covered a lot of the hows and the whys, but we still haven’t gotten to the issue of the whens. Obviously, if we knew when bad things might happen, it would be easier to use all those tools we’ve talked about to prevent them from happening in the first place. A lot of smart people have weighed in on how to figure out what’s coming our way, not the least of whom are some of baseball’s most legendary philosophers. Although most people think the unflappable Yogi Berra said it, it was really Yogi’s Yankee teammate Casey Stengel who said, “You should never make predictions, especially if they’re about the future.” Cute, but is it really true? Is it so hard to predict the future? If we could actually predict what was about to happen, it would make our task of preventing errors and mistakes a whole lot easier.
Personally, I don’t think it’s so hard to figure out what’s up ahead, and furthermore we do it all the time with common things such as, “If you drive when you’re drunk you’ll have an accident or end up in jail.” That’s a pretty reasonable prediction given our justifiable cultural and legal climate against hurting others with our own negligence. Or how about, “If you play with matches then you’re going to get burned.” Again, it’s another pretty good prediction about a very likely outcome in the future. Making reasonable predictions of what might happen soon is the final step of the big picture of Situational Awareness that we spent months on a while back. Let’s see how real experts try to figure out when errors will happen and see how predictions of the future will further help us become safer pilots.
Once again, we need to throw some science at this problem, even if it involves another mouthful of word salad from our pals with those high IQs. Theorists of high-reliability systems and human error have developed a “comprehensive model for evaluating the probability of a human error occurring throughout the completion of a specific task” – those of us down here on the ground would just call that a prediction. The methodology is known as the “Human Error Assessment and Reduction Technique,” which lends itself to the convenient acronym HEART, and is based largely on research done for the nuclear power industry by JC Williams (Williams JC. A data-based method for assessing and reducing human error to improve operational performance).
The HEART analysis retrospectively examines all the factors and circumstances that existed at the time of an adverse event or significant error and uses those circumstances to build a predictive model of when bad things are likely to occur. JC Williams calls such sets of circumstances “error-producing conditions” (EPCs). EPCs link traditional systems approaches to error avoidance with human factors analysis of individual performance, and by using some fancy mathematical modeling of probabilities these folks have derived formulas showing that when a given set of circumstances exists, a bad mistake is pretty likely to follow.
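For the mathematically curious, the core of that “fancy modeling” is simpler than it sounds. The published HEART formula starts with a nominal unreliability for the type of task, then scales it up for each EPC present, weighted by how strongly the analyst judges that condition to apply. The sketch below follows that structure; the function name and all the numbers in the example are purely illustrative assumptions, not Williams’ actual tables.

```python
def heart_probability(nominal_unreliability, epcs):
    """Sketch of the HEART calculation.

    nominal_unreliability: baseline error probability for the generic task type.
    epcs: list of (max_multiplier, assessed_proportion) pairs, where
    max_multiplier is the EPC's maximum effect from the HEART tables and
    assessed_proportion (0 to 1) is the analyst's judgment of how strongly
    that EPC applies to this particular task.
    """
    p = nominal_unreliability
    for max_multiplier, proportion in epcs:
        # Each EPC scales the probability by (multiplier - 1) * proportion + 1,
        # so a condition judged not to apply at all (proportion 0) has no effect.
        p *= (max_multiplier - 1) * proportion + 1
    return min(p, 1.0)  # a probability can never exceed 1

# Illustrative example: a routine task with nominal unreliability 0.003,
# performed while fatigued (multiplier 11, judged 40% applicable) and
# under time pressure (multiplier 5, judged 50% applicable).
p = heart_probability(0.003, [(11, 0.4), (5, 0.5)])
print(round(p, 3))  # the two conditions together multiply the risk 15-fold
```

Notice the key property of the model: EPC effects multiply rather than add, which is why a stack of individually manageable conditions (tired, rushed, unfamiliar airport) can push the odds of a serious mistake up dramatically.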
Williams’ theory has been widely validated by studies of accident reports from many different industrial and transportation events. These reviews have found that very similar combinations of conditions almost always existed and were the contributing causes of the accidents. In this way, EPCs can be seen as a common thread of contributing factors to major incidents. Identifying our own “EPCs” gives us very useful predictive tools for determining when we are more likely to make a mistake and when something bad might follow. It’s certainly a better strategy to prevent errors and accidents than to try to dig yourself out of a hole once you’re in it, especially if you are a few miles up in the sky at the time. These error condition models allow us to do just that. We do this intuitively in our everyday lives; it’s not a magical crystal ball that drives that little voice in the back of our heads to warn us not to drive after drinking or play with matches. It’s just putting 2 + 2 together from our common experience that when certain conditions exist, the likely outcome will be bad.
So go for it; think back on some of the errors or near misses you’ve made recently and take a guess (no peeking ahead) at some of the top EPCs described in the HEART analyses of bad outcomes and serious incidents. Start simple; think about the times you are most likely to forget something or mess up a simple task that is usually no big deal. It probably happened when you were tired, and as we can see from JC Williams’ data, the #1 most common condition leading to error is fatigue. It’s a simple, common-sense observation and fits the fact pattern of almost any catastrophe you can think of. The Titanic hit an iceberg in the middle of the night, the Chernobyl nuclear power plant melted down in the middle of the night, the Exxon Valdez ran aground in the early morning hours, and the world’s worst industrial accident, the release of some 30 tons of toxic gas from the Union Carbide pesticide plant in Bhopal, India, that killed tens of thousands of nearby residents, happened in the middle of the night, too. In my world of medicine, studies show that patients who have the same surgery or are admitted to an intensive care unit with the same diagnosis have a higher mortality at night than during the day. We just don’t think as clearly when we do complex tasks at night, when our brains want to get some rest.
Williams’ model fits aviation accidents as well. FAA accident studies show that aircraft crews are almost 50 times more likely to have an accident when they are fatigued. Fatigue was a major factor implicated by the NTSB investigation of the most recent fatal US airline accident, Continental Connection Flight 3407, which crashed 11 winters ago near Buffalo, New York. This is such a big factor that airline pilots are limited in how long they can fly, generally to about 8 hours at the controls, and if the flight lasts longer a relief pilot is on board to let the primary crew get some rest. But as GA pilots, we can fly late after a long day of work, or for that matter any time we want. If you are going to launch late at night, you have to know that it is one of those “error-producing conditions” and you have to maintain extra vigilance, because you are in the danger zone.
So even if he didn’t come up with that zinger about predictions, it was Yogi who said, “The future ain’t what it used to be,” and that is something I do believe is true. This powerful model gives us the ability to anticipate when we are at increased risk of bad outcomes and to maintain an especially high degree of vigilance during those periods to avoid what might otherwise be inevitable. There’s just not as much mystery about the future anymore when we add a heightened awareness of error-producing conditions to our bag of flying safety skills. We can predict the future, we can strive to avoid conditions that lead to errors, and we can guard against the errors that follow if we find ourselves in the middle of a set of these conditions. Next month we’ll talk about more EPCs and some countermeasures to avoid common error traps.