
Me and Robbie (Part Three)

We left things on a bit of a sour note last month, detailing some of the pitfalls and the dark side of cockpit automation.

The amount of information presented to us by advanced cockpit automation is a little like drinking from a fire hose, and it leaves no doubt that the inexorable advances in automation bring incredible advantages along with a few real limitations. One thing that has become obvious is that solving the automation conundrum requires a lot more from us than from the machinery. No matter how much Robbie’s capacities improve, we humans will always have the most advanced and powerful computer on the flight deck right up there in our own heads. There is just no debating the fact that pilots are still the most important element in flight safety. Situational awareness and human factors expert Mica Endsley agrees: “Although autonomy promises that systems will be able to perform many actions alone, no matter how capable, autonomous systems must still interact with humans, who serve the central function as supervisory controllers responsible for directing and overseeing their performance. Automation devices are only our teammates that we need to collaborate with.”

The dark side we identified last month is that, as the efficiency of automation improves and our trust and dependency increase, our problem-solving and piloting skills run a real risk of degrading in fundamental ways. The brain studies of London taxi drivers I cited last month showed that this kind of disuse produces real physical changes in the brain that correlate with the loss of problem-solving skills. The solution is multifaceted, and there are a number of habits pilots need to practice to reverse this trend.

The first aspect of the solution deals with basic flying skills. As we turn over more of our flight duties to Robbie, pilots are at risk of flight-skill “atrophy” from disuse and can suffer performance degradation at both ends of the workload spectrum. A pilot’s ability to perform the basic flying tasks necessary for safe flight diminishes with too little stimulation, as Robbie does all the flying and we get less time hand flying our airplanes. This can be seen in numerous accident reports, such as the tragic Air France flight 447 crash a decade ago, when the crew flew into a weather emergency and the automation shut down. The simple but critical stick-and-rudder skills needed to fly the airplane out of the nighttime storms were absent, even in high-time commercial airline pilots. A logical answer is to get a safety pilot and fly under the hood as often as you can, and also to click off Robbie and hand fly the airplane as much as possible. In particular, find some clouds to practice real instrument flying techniques and maintain good IMC flying skills and habits.

The second aspect to enhance our safe use of advanced automation deals with basic communication skills. The modern shift to automation means that communication skills need to expand to include communicating with the automation itself. A recent FAA report states it this way: “It is important to develop a style of communication that makes sure that information is communicated across all team members of the latest human-autonomy teams.” To do this, pilots need a real understanding of what Robbie must be instructed and programmed to do, and of how Robbie will answer in return. That means pilots need to understand all the pages of menus and messages Robbie uses to advise us of what the automation is doing in every stage of flight. A breakdown in any stage of this interaction can be fatal, as we saw last month examining the crash of American flight 965 in the Andes.

Thirdly, basic crew resource management skills need to be strictly adhered to and updated to include the automation as a full crewmember. Dr. Kathleen Mosier commented on this in our recent conversation and pointed out that “in modern CRM that now incorporates human-automation teaming, we are counting on automation to be a good team member, but much automation lacks one or more of the essential characteristics of a good team player: transparency, observability, understandability, and predictability.” Digging deeper into the same FAA report I quoted from above, there are tips pilots can use to deal with this problem as well: “As with human-human teams, human-autonomy teams must share a common plan to assure that actions are properly choreographed. This goes beyond the flight plan and includes all issues that would normally be discussed with the crew but may be harder to transmit to the automation. As automation rises to the level of a teammate, it is imperative that this new status be reflected in CRM curriculum. It is recommended that airlines review their CRM training and incorporate this new more powerful automation paradigm as a critical component. As with ‘human teams,’ it is important to maintain a common understanding of the task and team environment to keep everyone working toward a common goal. When new information becomes available it must be communicated or team members may find themselves working at cross-purposes. This can be especially challenging as the automation does not have the range of senses that humans have.”

The takeaway here is that CRM skills are more important now than ever before, since our new teams are made up of a couple of people and a couple of computers. We need to speak the same language with each other for safe flight operations, so if you have another “real” person in the cockpit with you, be sure to enlist their support when programming Robbie. Recognize that CRM now involves a crew of three and be sure all three of you are on the same page, checking each other and confirming that everyone’s (and everything’s) expectations match up. In the same report, the FAA goes on to comment, “Two people monitoring system programming and events doubles the chances that they will detect an anomaly, even if it was not detected by an automated decision aid, or recognize an inconsistent or inappropriate automated recommendation. Many cockpit procedures are designed on the premise that crewmembers will cross check system indicators as well as each other’s actions. Modern automation paradigms demand advanced CRM paradigms for safe operations.”

Another pilot skill that needs to evolve along with advances in automation is our favorite topic, good old situational awareness. Mica Endsley commented on this too, and her view is supported by NTSB accident reviews of a number of the incidents we have already covered: “An automation conundrum exists in which, as more autonomy is added to a system, and its reliability and robustness increase, the lower the situational awareness of human operators and the less likely that they will be able to accurately and quickly assess a critical change when needed. The human-autonomy systems oversight model depends on human situational awareness, monitoring, and trust, which are all major challenges to achieving adequate SA with automated machinery and underlie this automation conundrum.”

The takeaway from these studies, as Dr. Mosier told me, is, “It’s just impossible to achieve situational awareness when you are not sure what the automation is doing and cannot predict what it will do next.” It is imperative for all of us to understand our airplane’s automated systems so there is never a time the automation commands something you haven’t already considered in building your SA picture. Just like the three levels of SA we drilled down on recently, the new paradigm for building the big picture of SA must include what the automation is doing now and what it is likely to do in the near future.

The last requirement we have discovered is that pilots need to know how everything in the cockpit operates, where each system gets its power, and how to unplug or shut it down in a pinch. According to reports, a 737 Max flight just one day before the first crash was saved by a jump-seat pilot who did just this, leaning over the FO to pull the circuit breaker when Robbie commanded an unwanted nose-down pitch change. Also, get at least a working sense of the algorithms and programming logic so you know why the automation is doing what it is doing. Know how to program every function and what every page of menu options offers. The FAA agrees: “Training is clearly one of the key components to reducing the accident rate of planes equipped with glass cockpits, and this study clearly demonstrates the life and death importance of appropriate training on these complex systems. We know that while many pilots have thousands of hours of experience with conventional flight instruments, that alone is just not enough to prepare them to safely operate airplanes equipped with advanced glass cockpit features.” The conflict we need to resolve is simple: flying our airplanes at the highest order of automation keeps our own expectation bias out of the decision-making process, but only if we understand and program the automation accurately.

With the last of these layers peeled back, we find that the core of the issue comes back again to us, the pilots, and to training that needs to be carried out both in the airplane and in advanced simulation. Training for the new era of automation means we need to enhance our knowledge so we thoroughly understand our computerized instruments and never have to ask Robbie, “Why are you doing that?” Never abdicate your role as Pilot in Command to the automation, as we saw on the ground in Iceland or in the air in Cali, and never hesitate to turn it off and hand fly the airplane until you understand why the automation is taking you where it is. This also requires manufacturers to provide pilots with information to better manage unexpected automated commands and system failures. That is something that might have saved the 737 Max pilots and their passengers. The FAA has requested that pilots use the new Service Difficulty Reporting System to report malfunctions or defects with electronic flight, navigation, and control systems. Just like its big brother, the Aviation Safety Reporting System (ASRS), these data will serve to guide and improve further automation advances and training methods.

I’ve spent the last two years in this space talking about how the human brain functions, perceives information, and reaches decisions, and I have laid out the case that understanding these workings helps us reach good decisions even in tough situations. Just as that background helps us think through problems, the same knowledge of our computerized flight instruments is critical for our safety in the air. Artificial intelligence is just that, artificial; our intelligence is profound in its ability to learn new things and to be pliable, insightful, and adaptive, more than capable of figuring out issues that pop up unexpectedly. Never let these skills atrophy from disuse. Technology is here to stay, so prove Isaac Asimov was right after all: it is designed to help us. Since there is no way to change how machines “think,” a happy marriage is up to us. Don’t leave the ground with Robbie until the honeymoon is over and you know the relationship is solid.

Kenneth Stahl, MD, FACS

Kenneth Stahl, MD, FACS is an expert in adapting principles of aviation safety to the healthcare industry for patient safety. He also writes and teaches pilot safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery, surgical critical care and general surgery, holds an active ATP certificate, and has been an AOPA member for 20 years. Dr. Stahl practices surgery full-time and is Healthcare Division President for Convergent Performance, an industry leader in teamwork, checklist and accountability training and consulting. He can be reached at [email protected]
