
New Year's Predictions, Part 2

As we saw in last month’s dive into JC Williams’ Error Producing Conditions (EPCs), fatigue puts us at risk of making mistakes in the air and on the ground, so I hope everyone got a good night’s sleep after all the partying on New Year’s Eve. 

Fatigue was part of a bigger conversation about the common sets of conditions that make errors more likely to occur, and it turns out to be the single biggest error risk in Williams' analysis. A while back, in the article on the subject, we covered how to counter the risks of fatigue, the degradation of decision making and skills it causes, and how to give your brain an edge over the effects of being tired. The reason to keep discussing this topic is that there are obvious benefits to knowing the conditions that produce errors and using that information to make predictions that give us a heads-up when bad stuff is likely to happen. Being forewarned about these dangers means we have to up our game to a level of extra vigilance when those sets of conditions exist so we can avoid mistakes. If you had to put all this into one catchy phrase, it would be, “If you can see ahead, you can plan ahead.” How cool would that be? It would show that old Casey Stengel had it wrong; predictions about the future aren’t that hard to make after all.

Even though Williams’ statistical study showed that fatigue is the number one predictor of human error, there are certainly plenty more EPCs on his list. Take another minute, think back on some other times when something didn’t go as you had planned, and try to identify a few of your own EPCs.

What did you come up with? Compare your list to the next item on JC Williams’ list: “high-risk/low-frequency” events. Williams’ math shows that the risk of a fatal outcome when this condition exists is 17 times higher than when operating in more familiar and less risky environments. Just like all the bad things that happen in the middle of the night, this also fits with our own common-sense experience. High-risk/low-frequency events are the basis of the risk-matrix assessments that have been around forever. Gordon Graham has been out in front of this for years and developed training methods that he teaches to smoke jumpers and wildland firefighters. His work is based on a simple matrix plotting the risk of injury or even death on one axis against how frequently the operator has faced that set of conditions on the other. Obviously, the less experience the operator has and the higher the risk, the worse the outcome. In our world, think of what a GA pilot might face during a real flight.
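If you like seeing ideas like this in working form, here is a toy sketch of that matrix in Python. The cell labels and the advice strings are my own illustrative assumptions, not Gordon Graham's actual training material; the point is only the structure of the lookup.

```python
# A toy sketch of a risk-vs-frequency matrix like the one described
# above. Axis values and cell advice are illustrative assumptions.

VIGILANCE = {
    ("high", "low"):  "danger zone: brief and rehearse it before you ever need it",
    ("high", "high"): "risky but practiced: lean on your procedures",
    ("low",  "low"):  "unfamiliar but forgiving: slow down and think it through",
    ("low",  "high"): "routine: guard against complacency",
}

def assess(risk: str, frequency: str) -> str:
    """Look up the vigilance posture for a (risk, frequency) pairing."""
    return VIGILANCE[(risk, frequency)]

# The high-risk/low-frequency cell is the one Williams flags at 17x.
print(assess("high", "low"))
```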

Be honest with yourself: even as a high-time pilot, how many times have you flown a real, live approach to minimums or near minimums in low IFR? How about into an airport in the mountains or a crowded city? We all fly under the hood for practice, but it’s not the same as descending into the goo and having to be exactly on the glide slope to clear a few obstacles, picking up a little rime on the wings, bouncing around with zero visibility, and not seeing anything until you get over the rabbit and then having to land on a short, slick runway. No matter how much time you have in the left seat, this is a hugely risky and infrequent event for any GA pilot. The most obvious question is, why go into that airport at all? But what if it’s the only one around and you’ve run your tanks down too far to make it to the next closest airport and try another approach? Doug Downey taught us one of the best ways to avoid falling into this trap, a lesson we covered in the articles about his Bandit 650 flight and the follow-up podcast. He made the point that the best way to deal with unfamiliar events is to imagine every bad scenario that has never happened to you, but might, live it in your head, and then practice how to fly out of it to safety.

That brings us to Williams’ next EPC, “time pressure”; when it occurs, the risk of a bad outcome goes up 11-fold. This one also makes a lot of sense and fits right into our common experience. Just think about being really late for a very important event, running out of the house, letting the door lock behind you, and realizing your car keys are still inside. Of course, it usually happens when you are rushed and just don’t think to grab the keys as you sprint for the driveway. Consider EPC number three alongside the frequency/risk balance of EPC number two, and that brings up another important point Williams makes: the risks that EPCs portend don’t just add up; they multiply with each other. So multiply the time pressure of running low on fuel by the high risk/low frequency of an approach in LIFR weather and it’s easy to see a disaster in the making. If you wanted to be a purist and calculate the risk according to Williams’ theory, it is 17 x 11 = 187 times more dangerous to be out of practice doing something risky while pressed for time, and that’s probably not all that uncommon. It also helps explain the NTSB statistics we talked about a few months back: 65% of aviation accidents happen in the last eight minutes of the flight.
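For readers who like to see the arithmetic spelled out, here is a minimal sketch of that multiplicative model. The 17x and 11x multipliers are the ones quoted above; the function itself is illustrative, a simplified reading rather than Williams' full calculation method.

```python
import math

# A minimal sketch of the multiplicative EPC model described above.
# Multipliers come from the text; the function is illustrative only.

def combined_risk_multiplier(multipliers):
    """Multiply individual EPC risk multipliers into one combined factor."""
    return math.prod(multipliers)

# The low-fuel, LIFR-approach scenario from the text:
epcs = {
    "high-risk/low-frequency event": 17,
    "time pressure": 11,
}
print(combined_risk_multiplier(epcs.values()))  # 187
```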

The last two of Williams’ top five EPCs deal with decision-making biases, topics we covered in these pages in a number of different ways in the series on “Those Lyin’ Eyes.” There are numerous biases that interfere with our perceptions, our decisions, and even our sense of position in space, all of which can negatively impact flight safety. Each of these biases multiplies other risks by about eight- to 10-fold.

There are dozens more EPCs, so what can we do to avoid falling into these common error traps? For a start, and just like identifying your own “Fingerprints of Error” in the last two articles, you can identify your own sets of error-producing conditions with a simple diary or log of your mistakes and near misses. Then, in your debrief of the incident (you do debrief every flight, even just to yourself when you fill out your logbook, right?), make a chart of what happened and all the circumstances that existed at the time. Since common things occur commonly, the chances are your incident will fit one of the established patterns; but it might not, and that is where the real value of this little exercise lies: in uncovering the conditions that are uniquely yours. It will prove to you that seeing into the future is not all that hard after all, and being able to make those predictions will keep you forewarned and on guard during the risky periods of a flight. If you find yourself slipping into a set of circumstances where you know an error is more likely to happen, ramp up your game, increase your vigilance, and use your “pre-knowledge” of what lies just around the corner to avoid a bad outcome. It’s just like Doug Downey when he flew his stealth jet, daring the airplane to fail him and knowing in advance how he would handle it when it actually did.
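If you want to keep that diary in a form you can actually mine for patterns, something as simple as the following sketch will do. The field names and the sample entries are purely hypothetical; the idea is just to tag each incident with the conditions present so your personal EPCs emerge from the tallies.

```python
from collections import Counter
from dataclasses import dataclass

# A hypothetical structure for the post-flight debrief log described
# above; field names and sample entries are illustrative only.

@dataclass
class IncidentEntry:
    date: str
    what_happened: str
    conditions: list  # EPCs present, e.g. "fatigue", "time pressure"

log = [
    IncidentEntry("2025-01-02", "busted an altitude on descent",
                  ["fatigue", "time pressure"]),
    IncidentEntry("2025-02-11", "missed a checklist item before takeoff",
                  ["time pressure"]),
]

# Tally which conditions keep showing up; the repeat offenders are
# your personal EPCs, the ones to watch for on every flight.
condition_counts = Counter(c for entry in log for c in entry.conditions)
print(condition_counts.most_common())
```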

Kenneth Stahl, MD, FACS

Kenneth Stahl, MD, FACS is an expert in the principles of aviation safety and has adapted those lessons to healthcare and industry to maximize patient safety and minimize human error. He also writes and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care, and general surgery. Dr. Stahl holds an active ATP certification, is a 25-year member of the AOPA, and has thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].
