Those Lyin' Eyes (Part 2)

Expectation bias (EB), our tendency to believe our “lyin’ eyes” when they tell us what we want to be true rather than what is actually true, can be a killer in an airplane. Last month we discussed EB in detail and cited examples of just how dangerous it can be.

Let’s look at each of the flights we talked about and see how the crews allowed their expectations to affect their decision making. There are important skills to learn from these examples and carry with you on every flight, skills that will help you recognize your own expectation bias and not fall prey to its powerful charms.

The first thing to keep in mind is simple—always remember that EB exists and that it is most likely to grab you when you least expect it. The symptoms are so subtle and insidious that constant vigilance against the temptations of your own EB is crucial. We spent the last few articles talking about building the big picture of situational awareness (SA), but a critical part of your SA picture is understanding that it’s a prediction and not reality. If you let it, EB will totally corrupt your SA by keeping your brain stuck on the previous information you used to build your old SA picture instead of reassessing new information that might change everything. Anticipation is a good thing, but always keep in mind that your SA is based on what you believe is coming in the near future. It’s your best guess, supported by information you have collected, but it needs to be checked, updated, and corrected before you act on it. EB strikes when you let your best guess of what’s about to happen become so strong that it distorts your perception of what is really happening and overwhelms any chance to revise the plan. EB leads you to the dark side of faulty situational awareness, and that’s the EB trap that all the pilots I talked about last month fell into.

The Delta flight crew descending into Rapid City demonstrated a common symptom of expectation bias, something psychologists call “anchoring.” It’s the natural tendency to get stuck on information gained early in the decision-making process and make it your final judgment. No matter how much they anticipated it to be true, the first runway they spotted as they broke out of the clouds was not the one they thought it was. They just couldn’t push themselves off their old expectation and recognize that the runway they saw was close to, but not, the correct airfield. The EB trap snapped shut on them because they got stuck on their first guess, anchored to it, and disregarded additional information they should have used to reassess, check, and adjust their SA picture. The takeaway lesson is to resist the temptation of letting your assumptions become your reality without rechecking them, and never let your expectations taint your perceptions so much that you lose objectivity.

This is exactly what Francis Bacon observed 350 years ago. You can only overcome this by weighing your sources of incoming data, relying more heavily on reliable sources like your instruments and discounting the usually unreliable ones, like those lyin’ eyes. This requires constant vigilance, and it’s an active mental process. Actively check even your usually “reliable sources.” Learn to be critical of your own thought process and seek to disprove your expectations, using this constant vigilance to get your SA right and keep it current and correct. This leads to the second lesson to take away from these examples, something none of these flight crews seemed to have done, and a simple skill we have all drilled for years: use your instruments. There’s nothing subjective about what they’re telling you, so don’t allow your lyin’ eyes to tell you something else. Certainly, the needles would not have been centered for an approach into Rapid City while their airplane was descending into Ellsworth AFB, no matter what the crew saw outside the cockpit. Always remember that your instrument panel provides powerful aids to anticipate and counteract expectation bias.

Setting up your instruments properly can help you avoid potential problems even before you leave the ground. Make it part of your flying habits to set the course selector needle and heading bug on your HSI to the heading of your assigned runway as soon as you get it from the tower. If you ever line up for takeoff and the course indicator arrow is not pointed exactly at the heading bug, you are about to use the wrong runway. This simple little step takes all the last-minute guesswork out of the takeoff sequence, and it would have alerted the Delta Connection crew of flight 5191 in Lexington, KY, that the nose of their regional jet was pointed in the wrong direction, allowing them to avoid the disaster that cost 49 people their lives. Use similar skills while in the air; use those “4 T’s” we all learned way back when working on our instrument ratings. As soon as you get a new heading assignment, “twist” the heading bug to it and then make sure your “turn” puts you on the right heading, the one that was assigned. That will prevent turning to an “expected” heading that you might have predicted but was not your most current instruction from ATC.
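The pre-takeoff cross-check described above can be sketched in a few lines of code. This is only an illustration of the logic, not anything from a real avionics suite; the function names, the 10-degree tolerance, and the runway-string format are my own assumptions:

```python
def heading_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two compass headings, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def lined_up_on_assigned_runway(assigned_runway: str,
                                course_selector_deg: float,
                                aircraft_heading_deg: float,
                                tolerance_deg: float = 10.0) -> bool:
    """Cross-check before takeoff: the runway number (times ten) should match
    both the course selector setting and the aircraft's actual heading."""
    # "22", "28R", "36L" -> 220, 280, 360 degrees
    runway_heading = (int(assigned_runway.rstrip("LRC")) * 10) % 360
    return (heading_diff(runway_heading, course_selector_deg) <= tolerance_deg and
            heading_diff(runway_heading, aircraft_heading_deg) <= tolerance_deg)
```

With these assumed numbers, a crew cleared for runway 22 but pointed down a runway whose heading is around 264 degrees would fail the check, which is exactly the mismatch the heading-bug habit is meant to catch.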

The pilots of Air Canada flight 759 in San Francisco were snared by another EB trap. Their expectation bias was rooted in the first mistake they made that evening: they failed to check the KSFO NOTAMs published for their arrival time, so they lacked critical information to build their SA for the approach. If they had checked the current information, they would have known that runway 28L was closed and its lights were turned off at 10 p.m. local time. Like the crew headed to Branson, MO, they were experienced pilots and had flown into San Francisco many times. They knew the runway configuration at KSFO, so their expectation was to see two parallel runways, 28R and 28L, with two sets of lights. What they actually saw was runway 28R and, next to it, another set of lights illuminating Taxiway Charlie. Even though runway and taxiway lights are different colors, white for runways and blue for taxiways, this still wasn’t enough to overcome the power of their expectation bias to see two parallel runways. They were cleared to land on 28R, so they lined up on the lights on the right side, a mistake that nearly cost 1,000 passengers their lives. Just like the other crews, they should have used their instruments, especially when approaching parallel runways with similar alignments and headings. Even though it was a visual approach in VFR conditions, if they had loaded the RNAV or ILS approach and cross-checked to confirm the correct one had been dialed in, the displaced CDI bar would have alerted them to the mistake. This simple step would have warned them that their expectations didn’t fit reality. Learn from this near-fatal mistake: set up the approach even in VFR conditions, and if your CDI ever fails to line up with the course selector arrow, you’re about to make a big mistake. When you do set up your instruments, do it as soon as ATC gives instructions, while the information is still fresh in your mind.
Add these takeoff and landing steps to your flying habits and checklists, and do them every time.
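A little geometry shows why a cross-checked CDI is such an unforgiving witness on final. The numbers in this sketch are my own illustrative assumptions, not figures from the AC759 investigation; the 2.5-degree full-scale figure is a typical localizer value:

```python
import math

def centerline_deviation_deg(lateral_offset_m: float,
                             distance_to_threshold_m: float) -> float:
    """Angular deviation from the extended runway centerline, in degrees.
    This is roughly what a localizer-style CDI displays on final."""
    return math.degrees(math.atan2(lateral_offset_m, distance_to_threshold_m))

# Assumed scenario: lined up on a surface about 150 m right of the runway
# centerline, 3 NM (3 * 1852 m) from the threshold.
deviation = centerline_deviation_deg(150.0, 3 * 1852.0)
# That works out to roughly 1.5 degrees -- more than half-scale deflection
# on a typical localizer display whose full-scale is about 2.5 degrees,
# an unmistakable cue that the lights ahead are not the runway.
```

The point of the exercise: even a modest lateral offset produces a needle displacement no pilot could mistake for “centered,” which is why loading the approach even in VFR conditions is such cheap insurance.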

Lack of precise communication is another early casualty of EB, since EB will alter your understanding of information that should change your previous plans. This mistake was one of the elements in the Tenerife disaster. When issued instructions by ATC, concentrate on the words and focus on listening; then repeat them out loud and read back to ATC exactly what you heard. This is the reason read-backs and hear-backs are mandated by the Federal Aviation Regulations. Then apply that new information actively, with an open mind, to change your plans if that’s what’s called for. If something still doesn’t make sense, like an incorrect call sign, runway assignment, altitude, or approach heading, don’t hesitate to check on it. Never be afraid to ask ATC to “say again, please.” It’s right there in the Pilot/Controller Glossary and expected by ATC to avoid misunderstandings. Nobody at ATC or in the air around you wants any confusion about where you should take your airplane. Practice your listening and communication skills, and take training and refresher seminars on communication precision for accurate information exchange.

Again, as Francis Bacon warned so long ago, don’t cherry-pick your data or select limited information to make new decisions or support the ones you made previously. Use all of the data you have and make sure that everything is consistent. If something feels wrong, then something is wrong, so tell ATC you think it’s wrong. Fight your expectation bias by testing your assumptions and aiming to disprove your expectations. Leave your ego on the ground; you very well might be wrong, so always be willing to change your assumptions. All of this adds up to constantly reassessing and rebuilding your SA, testing it against “reality,” and making sure it matches. If it doesn’t, something is wrong, so tell ATC you need to circle or just hold a heading until you sort things out.

Related to this tendency to cherry-pick data to prove our lyin’ eyes are really telling us the truth is something I’ve been seeing around the web lately called the “Google Effect.” The way people use search engines these days feeds the worst of our expectation bias. Modern Internet search engines are now so good, so effective, and contain so much information that anyone can find supporting “evidence” on almost any topic. Just because you can find supporting evidence out there does not mean it’s conclusive. You have to make sure it fits into the most accurate SA picture you can build. Take all the information at hand in the context of the other information available, some of which might contradict your expectations, and check and reassess your SA. We have a real tendency to disregard what we don’t like or what doesn’t match our wishful thinking or strongly held expectations; resist this temptation.

The FAA has looked at all kinds of technological fixes for EB, but in the end, if pilots are still willing to believe their lyin’ eyes, that will defeat all these technological add-ons. Don’t believe what you want to be real; constantly reassess and keep up your SA, make sure you set up your instruments correctly, and then trust them. I talked about being your own copilot in a two-part post a while back called “Who Will Guard the Guardians.” If you are flying by yourself, say everything out loud. There’s science behind this rule—the cognitive effort expended to process the information, form complete sentences, and say them out loud means you have completed the thought process, not just given it a passing, incomplete consideration. Listen to your own words to make sure they sound right. Be skeptical and self-critical, and always recheck your assumptions to fight off EB. Constantly reassess yourself by testing your predictions to be sure you’re not letting those lyin’ eyes suck you into the big black hole of expectation bias.

Kenneth Stahl, MD, FACS
Kenneth Stahl, MD, FACS is an expert in the principles of aviation safety and has adapted those lessons to healthcare and industry for maximizing patient safety and minimizing human error. He also writes about and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care, and general surgery. Dr. Stahl holds an active ATP certificate and is a 25-year member of AOPA, with thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].