
Fingerprints Part 2

Last month we talked about the personal fingerprints of our individual error patterns and how we hide our own mistakes deep inside, in some pretty sacred places we rarely visit. It’s even rarer that we open up and share them with anyone else. But in order to understand our own error patterns, we have to acknowledge our tendencies, and that is the first step toward developing safer attitudes and behaviors.

All the decisions we make in everything we do are choices. Our error fingerprints identify how we choose from among all the knowledge options we have studied so hard to store in our heads. Safety experts recommend that a good place to start fingerprinting yourself and your unique error tendencies is to keep a brief written diary of errors, near misses, and mistakes throughout the day, or at least over the course of your flights. You can even come up with a little form to fill out listing the conditions of the flight, the time, your own condition, and the nature of the mistake or near mistake. No doubt it’s painful, but it’s very helpful and insightful.
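If you would rather keep that diary electronically, here is a minimal sketch of what a single entry might look like as a small data record. The field names and the Python format are only an illustration of the kinds of things worth capturing, not a prescribed form, so adapt them to your own flying.

```python
# A minimal sketch of one entry in a personal error log (illustration only).
# The fields mirror the suggestions above: conditions of the flight, the time,
# your own condition, and the nature of the mistake or near mistake.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ErrorLogEntry:
    when: datetime          # when the error or near miss happened
    phase_of_flight: str    # e.g., "taxi", "climb", "cruise", "approach"
    flight_conditions: str  # weather, workload, how far into the flight you were
    pilot_condition: str    # fatigue, distraction, rushing, etc.
    what_happened: str      # the mistake or near mistake, in plain words
    caught_by: str = "self" # who or what caught it: self, ATC, checklist, passenger

# A hypothetical example entry (the details are made up for illustration):
entry = ErrorLogEntry(
    when=datetime(2024, 6, 1, 16, 40),
    phase_of_flight="cruise, late in the flight",
    flight_conditions="in and out of clouds, busy frequency",
    pilot_condition="tired after three hours in the seat",
    what_happened="dialed the new frequency into standby but never hit the flip-flop",
    caught_by="ATC",
)
print(entry)
```

Even a handful of entries like this, reviewed every few weeks, is enough to make the repeat offenders in your own pattern stand out.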

I have kept a little pad around during my daily routines, and I see some obvious patterns in my errors, such as driving over the speed limit, following too closely behind the guy in front of me, and sometimes pushing through those yellow lights we should stop for. Not major stuff, but things that sure could lead me into a wreck. By keeping a record and seeing my own error trends, and just how often I did those things, I made myself more aware of my error tendencies and quit doing them. I’m a better driver and have probably reduced the chances I’ll get hurt or hurt someone else in a car accident.

In the air, I put a blank sheet of paper on my knee clipboard to track my mistakes, and I found it didn’t stay blank for long. By tracking my flights for a few months, a pattern of two errors I tend to make became clear. The first is pretty minor, but it’s really maddening. When I get a frequency change from ATC, I enter it into the standby box or the second radio but forget to hit the flip-flop button and change radios before I make the new call. I know I’m not the only one, because you hear it all the time when ATC issues a frequency change and the pilot announces his call right back at the original controller, who has to say to the pilot, “You’re still on center frequency, switch to …” Another “com” thing I’ve heard broadcast out over the radios is those famous “greetings from the flight deck” announcements that happen when someone up front forgets to switch from the com radio to the intercom, which earns a little admonition from ATC: “Captain, you’re still on center frequency.” A chatty pilot can tie up the frequency for a while with those broadcasts about seat belts, tray tables, and bladder control, and nobody else can get their instructions from ATC until it’s over.

I’m not alone with this little problem, but I’m not exactly sure why I, and others, do it so often. For me, I think I concentrate so hard on remembering the new frequency and getting it entered, and I guess I’m so happy that I got it right, that I just don’t complete the steps of actually making the radio change. But keeping a log of when I have done it has given me some insight into my little error pattern and, maybe even more important, into some other risk factors that might be associated with it. I have found that, generally, the later in the flight, the more likely I am to make this mistake. Might this be linked to other mistakes that I am prone to make after this has happened?

The answer is sobering when you consider the other things that happen after a long flight, like shooting an approach and landing. A quick review of NTSB statistics shows that 65% of plane crashes occur during the last eight minutes of the planned flight. Obviously, that’s when the crew is maneuvering for the approach and landing phases of flight. It’s not just that landing is the most technically difficult piloting skill; it comes at the end of the flight, when you’re already tired. Tracking a small error that I make has given me a warning that I am entering a risky phase of my own brain function and of the flight. This kind of self-awareness regarding attitudes and behaviors is exactly the factor that the KABO training methods I talked about last month address. It has led me to be extra vigilant on all aspects of my piloting, especially as the trip lengthens, since the really dangerous time in the flight is coming. There is a whole body of research in these areas called “Error Producing Conditions,” which are the times when something bad is likely to happen. This was researched for the nuclear power industry by a British scientist named J.C. Williams, but much more on this in upcoming articles.

Another one of those swirls in my personal error fingerprints that I noticed in my little error log is my inability to move on from a poor choice to a plan B that might be a safer alternative. This is something I talked about a few months ago in the article titled “Those Lyin’ Eyes,” and it is known as “plan continuation bias.” It showed up as a trend of feeling that “I have to push on” and land the airplane after a less-than-perfect approach. I’ve heard a lot of CFIs say that the best predictor of a bad landing is a bad approach, and they’re right about that. I think I’ve broken the habit, but it’s taken me a while to drill into my head that “cleared to land” doesn’t necessarily mean that you have to land.

This happened to me a while back going into a little GA field in Tampa. Peter O. Knight Airport (KTPF) has a short runway sticking out on a peninsula just south of downtown. I’m not familiar with that airspace and wanted to get down and get slow anticipating the short runway, but Tampa Approach kept me at 2,500 feet to clear a line of cell phone towers on the east side of the bay before they let me descend. I was in and out of some puffy clouds, and when I cleared the towers and broke out I was looking straight ahead, right at MacDill AFB, with its nice long tempting runway lined up on the same heading as Peter O. Knight. The air base is right across from KTPF, just a short hop across the Tampa Bay inlet. The controller politely drew my attention to the fact that it looked like I was aiming at the Air Force base. He was right, I was looking at the wrong runway, but of course I said no, that I was just squaring it off for a north landing at Peter O. Knight. Yeah, right! By the time I reoriented my SA to the right airport, I was too high, too fast, and almost past the runway at Knight, and I had to force that risky turn to final and try to get down, slow down, and land. As predicted, I made a horrible landing that I had forced out of a terrible approach. I wrote it down on my little paper and thought about what I did wrong and what I should do to avoid falling into that 65% statistic of accidents on approach and landing. By keeping a log of the times I forced that infamous “turn to final,” I recognized my own error pattern, and I think I have broken myself of that habit. Now, if the approach isn’t completely stable, with flaps, gear, and speeds all set by 1,000 feet AGL, I force myself to go to plan B: go around and just don’t land from that approach.

Tracking my own errors has helped me realize that my mistakes are not usually knowledge-based; they are decision-based. As painful as it might be, get your own personal fingerprints of your errors with a log of the mistakes you make or the things you know you could have done better. That way you will never look back at an accident or an incident and have to kick yourself with that famous after-incident thought, “What was I thinking!?” You weren’t thinking, and that’s what went wrong. The past is the best predictor of the future, and your own error tendencies and trends predict your safety. Identify those error trends before something really bad happens, and predict a safe landing for yourself and your passengers every time.

Kenneth Stahl, MD, FACS
Kenneth Stahl, MD, FACS is an expert in principles of aviation safety and has adapted those lessons to healthcare and industry for maximizing patient safety and minimizing human error. He also writes and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care, and general surgery. Dr. Stahl holds an active ATP certification and is a 25-year member of the AOPA with thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].
