While trying to figure out who killed Evelyn in Stuart Turton’s twisty, convoluted murder mystery, The 7½ Deaths of Evelyn Hardcastle, Aiden Bishop, the main character, lamented, “Too little information and you’re blind, too much and you’re blinded.”

That reminded me of a problem I had with my study habits in college and medical school. As I plowed through all the textbooks I was supposed to learn, I just couldn’t control my urge to underline everything I read, since, at least at the time, everything seemed so important. The problem is that if the whole page is underlined in Day-Glo orange, then nothing is really highlighted. My original intent was to zoom in quickly on the most important details when I went back to review the material. It was a noble effort, but I had defeated it myself: nothing was emphasized, everything looked the same, and I had drowned myself in too much information, “TMI.”

Fast-forward a lot of years since medical school, and I still haven’t shaken the TMI issue in my pilot and surgical lives. Except for the minor detail that one is on the ground and the other up in the sky, the modern glass cockpit is really very similar to my other world as a surgeon working in an intensive care unit (ICU) full of critically ill patients, all of whom are hooked up to wires and monitors that track and beep with every heartbeat, breath and drip from their IVs. The ICU environment presents nurses and docs with that same TMI problem. Basic information is crucial, but some really important and potentially dangerous changes in a patient’s vital signs, and likewise in our airplane, can easily be swallowed up and lost in the general noisy chaos and informational deluge of those environments. With so much noise from all those alerts and alarms, really crucial data that might demand quick action in the air or in the ICU can hide in plain sight. Just as Aiden Bishop said, we’ve gone from a bygone era of flying “blind” (literally and figuratively) to our current state of having so much information that we’re blinded by it.

Our word “alarm” comes from the Old French à l’arme, “to arms.” As the words indicate, it’s a call for immediate action, to attack or defend, but too much of that kind of stress can really wear us out. Studies show that bombarding our senses with alerts and alarms leads to “alarm fatigue,” and that increases, not decreases, our workload. In the sky this brings up a crucial question for GA pilots: Has all the information we get bombarded with in our advanced glass cockpits made GA flying any safer? It’s an issue that has been debated for a decade in our pilot universe. In a report on fatalities in glass-equipped GA airplanes a few years back, the FAA concluded that we’re not safer “because glass cockpits are complex and vary from aircraft to aircraft in function, design and failure modes, and pilots are not necessarily provided with all of the information they need.” Without hesitation, the GA world, including the AOPA, fired back at the study with the counterargument that it only reflected new technology that pilots needed more training and experience with to achieve the intended safety goals. That’s a fair criticism, but it might not be the only answer, since the problem is not unique to the GA world, or to the surgical ICU either. This alarm and information overload problem extends right into the commercial heavy-iron cockpit.

Consider a near disaster in the aviation world a few years back: the all-engine-out landing of an Airbus A330, Air Transat Flight 236, in the Azores on August 24, 2001. The Portuguese Aviation Safety Board (GPIAAF) report states, “Four hours into the flight, the aircraft experienced unusual oil indications. Two hours later, a fuel system failure led to a full-blown emergency that was not evident to the crew until it was too late. Although all relevant data to avoid the emergency were available to the pilots from the aircraft computer systems, the design choices made about what to display and how to display it kept the pilots in the dark.” The information was there for the taking, but the pilots missed it because it was buried among all the other data the displays presented to them. Asaf Degani wrote about this incident in The Journal of Cognitive Engineering and Decision Making, “Information Organization in the Airline Cockpit: Lessons from Air Transat Flight 236.” He concluded that the very large amounts of data available for presentation and display far exceed what we humans can absorb and comprehend. “Glass cockpit data are processed and commonly presented piecemeal, meaningful interrelations among individual bites of information are lost. Both problems lead to the risk of ‘drowning’ in disorganized data due to information overload as pilots begin to lose the ability to manage and comprehend the amount of data that’s provided on cockpit displays and onboard computers.” This is a lot like the cockpit information systems problems we talked about in earlier articles on automation, “Me and Robbie.”

The FAA and GA organizations are still debating the glass cockpit safety question, but both would agree we need some fixes that involve changing our thinking about information management and information overload. The FAA said that rather than disorganized and unfocused piles of information, we really need only the information relevant to our specific flight, preferably with an emphasis on the most important data. C. G. Schuetz proposes a model of information management that summarizes and then combines data into bite-sized cubes of the most relevant information that he calls “data containers.” These are packets of information “arranged and presented to pilots in a hierarchically organized display along an information highway based on importance and relevance to current flight operations.” The AI gurus behind the glass panel designs would agree and have engineered some of this into our fancy glass panels. But it’s still up to us as pilots to organize, distill and, more importantly, filter all the data coming into our brains from modern cockpit displays. The information deluge needs to be controlled and consolidated into useful-sized chunks that keep flight information limited to the most critical and useful data, accessible right in front of us when we really need to see it.
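To make the “data container” idea concrete, here is a minimal sketch in Python of how alert filtering by importance and flight-phase relevance might work. This is a hypothetical illustration, not Schuetz’s actual model or any avionics vendor’s implementation; the alert names, priority scale and `build_container` function are all invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Alert:
    message: str
    priority: int                 # 1 = critical ... 5 = advisory (invented scale)
    phases: set = field(default_factory=set)  # flight phases where this alert matters

def build_container(alerts, phase, limit=3):
    """Return only the few most important alerts relevant to the current flight phase,
    rather than everything the sensors can produce."""
    relevant = [a for a in alerts if phase in a.phases]
    relevant.sort(key=lambda a: a.priority)   # most critical first
    return relevant[:limit]

alerts = [
    Alert("Fuel imbalance", 1, {"cruise", "descent"}),
    Alert("Cabin door advisory", 5, {"taxi"}),
    Alert("Oil temp rising", 2, {"cruise"}),
    Alert("Transponder reminder", 4, {"taxi", "takeoff"}),
]

# In cruise, the taxi-phase advisories are filtered out entirely,
# and what remains is ordered by criticality.
print([a.message for a in build_container(alerts, "cruise")])
# → ['Fuel imbalance', 'Oil temp rising']
```

The design point is the same one the article makes: the filtering and ranking happen before anything reaches the display, so the pilot sees a short, prioritized list instead of the full firehose.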

We live in a world where there is more and more information that contains less and less meaning. Way back in 1934, in his play The Rock, T. S. Eliot nailed it: “Where is the knowledge we have lost in information?” In all of our worlds, up in the air and down here on the ground studying for exams or taking care of sick ICU patients, information needs to contain knowledge, but that knowledge is fleeting and easy to lose if you try to drink all of that information from a fire hose. Be smart: manage and prioritize your information, and limit your intake to the critical data you have the cognitive bandwidth to absorb, so the important stuff can’t hide from you in plain sight.

Kenneth Stahl, MD, FACS
Kenneth Stahl, MD, FACS is an expert in principles of aviation safety and has adapted those lessons to healthcare and industry for maximizing patient safety and minimizing human error. He also writes and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care and general surgery. Dr. Stahl holds an active ATP certification and is a 25-year member of the AOPA with thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].