
Why Pencils Have Erasers

A few months ago my wife and I rented a condo in a little beach town on the Florida coast a bit south of the Kennedy Space Center.

There’s a tiki bar in the village decorated with funny signs and cute one-liners, and ordering a burger there means you end up reading all of them while it cooks out back. The other day I saw one that got me thinking again about our lives as pilots. It said, “No one is perfect, that’s why pencils have erasers.” Pencils have erasers so that if you mess something up you can make it disappear and do it over again. The downside is that erasers encourage, or at least allow, people to be sloppy and careless about mistakes, since they figure that in the end it really doesn’t matter very much. It’s a way of learning to be careless, because the eraser is always there on the other end of the pencil to bail you out. Pilots’ pencils don’t have erasers, for the simple reason that we just don’t get many do-overs. It’s a theme preached in a bunch of the safety articles here at AOPA, and one we also talked about in last month’s article.

This concept of “learned carelessness” isn’t new and has been studied as a source of aviation accidents for a while. Two authors, Christopher Wickens and Justin Hollands, offered one way to explain it: humans are “cognitively lazy.” We tend to follow the path of least cognitive resistance. When things go well and a task is easy to accomplish, that ease becomes positively reinforcing. We get sloppy, and that increases the likelihood of getting it wrong. The result can be a tragedy. How we get ourselves into this “cognitively lazy” trap is a little more complicated. One possible explanation is found in a great review that two German authors, Christoph Moehlenbrink and Meike Jipp, presented at a recent human factors conference. Their theory of learned carelessness and cognitive laxity is that we are victims of our own success. Because everything usually goes well, we let our safety guard down and figure that if anything were to go wrong, there’s always the eraser end of the pencil to fix it.

Moehlenbrink and Jipp developed this idea from a flight simulator study of commercial airline crews. The study looked at how easily flight crews could be lulled into missing incorrect data loaded into their cockpit software. Two groups of flight officers were presented with two types of flight plans. One group was given flight plans that always contained mistakes; the other group always got correct flight plans, except for the last one, which contained a serious error. The results aren’t surprising. The folks who got bad data every time were kept on their toes by the consistent mistakes and became pretty good at spotting the errors, but the group that got correct flight plans every time except the last missed the fatal flaw. The conclusion was that the group with the wrong data never let their guard down; they maintained their “cognitive vigilance” and learned to be careful. The group with the correct plans, by contrast, got so comfortable that they “learned” to be careless. They became victims of their own success, lulled by routine procedures that were consistently error-free. In the end they got sloppy and developed bad habits and safety-critical shortcuts.

Accident reports show this has real-life consequences. One incident happened on May 22, 2015, in Paris, when Air France cargo flight AF6724, a Boeing 777 departing for Mexico City, suffered a tail strike on rotation and cleared the fence at the end of the runway by less than 100 feet. It turns out the first officer had entered a takeoff weight into the flight management system (FMS) that was 100 tons (200,000 lbs) too light, and the captain, so used to always seeing correct data, missed the mistake. According to the French investigation, the calculated VR was 20 kts too slow for the actual weight of the airplane, and the airplane couldn’t get into the air at that speed. When the captain brought the nose up, the tail hit the runway. The airplane finally accelerated to the required rotation speed and got airborne, but by then it was almost too late to clear the obstacles off the departure end. This was a real potential disaster: the plane was nearly out of runway and would have plowed into the residential area beyond it, the same neighborhood that Concorde flight AF 4590 crashed into, killing everyone on board and four people on the ground. Disaster was avoided only because the captain kept accelerating instead of trying to abort the takeoff with too little room left to stop. It turns out that one of the relief officers on the flight deck even noticed the mistake but didn’t say anything.

The pilots on that flight deck fell neatly into the two study groups in the German report: the captain was used to correct flight plans and missed the error, while the relief first officer was used to checking for mistakes and saw it. Too bad the RFO didn’t speak up. These ideas of cognitive laziness and carelessness might also help explain FAA data showing a U-shaped curve for accident rates with respect to flight experience. The rate of GA accidents decreases as pilots gain flight time, but only up to a point, around 3,000 hours; after that, accidents start to climb again as pilots log more time. The takeaway lesson for us pilots in the GA world is that as the perception of risk goes down, the effort to guard against mistakes goes down with it, and the chance of making a critical mistake goes up. It’s like looking without seeing. If you let yourself get used to the eraser end of the pencil, you’re more likely to make a bad mistake. It fits nicely with the theme of “The Wrong Stuff” from a few months back. Plenty of human factors engineers have tried to design error-trapping algorithms into cockpit systems, but in the end there’s always a workaround we can find if we let ourselves get lulled into being careless. Once learned carelessness has set in, it distorts our perception, selection, and interpretation of subsequent information because it impairs our motivation to detect errors. These errors fall through the cracks because we’ve grown complacent, numbed into missing things after so many error-free repetitions.
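To make the idea of error trapping a little more concrete, here is a minimal sketch, in Python, of the kind of gross-error check that might sit behind an FMS weight entry, the sort of mistake that bit AF6724. Everything in it is an illustrative assumption: the function name, the weights, and the 5,000 kg tolerance are made up for the example and don’t reflect any real avionics logic.

```python
# Hypothetical sketch of a gross-error ("reasonableness") check on a
# takeoff weight typed into an FMS. All names, numbers, and the
# tolerance are illustrative assumptions, not real avionics logic.

def check_takeoff_weight(entered_tow_kg: float,
                         zero_fuel_weight_kg: float,
                         fuel_on_board_kg: float,
                         tolerance_kg: float = 5_000) -> bool:
    """Return True if the entered takeoff weight agrees with an
    independent estimate (zero-fuel weight + fuel) within tolerance."""
    independent_estimate = zero_fuel_weight_kg + fuel_on_board_kg
    discrepancy = abs(entered_tow_kg - independent_estimate)
    if discrepancy > tolerance_kg:
        print(f"WARNING: entered TOW {entered_tow_kg:,.0f} kg differs from "
              f"the estimate of {independent_estimate:,.0f} kg "
              f"by {discrepancy:,.0f} kg -- check your numbers")
        return False
    return True

# Example loosely modeled on the AF6724 scenario (weights are made up):
# an entry roughly 100 tonnes below the independent estimate trips the
# check immediately.
check_takeoff_weight(entered_tow_kg=243_000,
                     zero_fuel_weight_kg=238_000,
                     fuel_on_board_kg=105_000)
```

Of course, a check like this only works if the crew treats the warning as a real alarm instead of one more nuisance message, which brings us right back to learned carelessness.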

The FAA has put a lot of effort into studying this problem to enhance aviation safety, and I have to send a shout-out of thanks to Dr. Bill Kaliardos, head of Human Factors Integration at the FAA. After reading last month’s article he sent me some great feedback and links to research the agency has been doing. His work details five elements of pilot “learned carelessness”: degradation of knowledge and skills, reliance on automation as the primary way to detect problems, reduced attentiveness, alarm fatigue (false alarms or ignored alarms), and automation-induced effects on situational awareness and teamwork. It all boils down to a real problem: our suspicion and sense of risk get disarmed by the lullabies of repeated easy answers.

There are important things we can do about our own habits in the cockpit to guard against these kinds of errors. Bill Kaliardos’s FAA study group recommends some solutions, the AOPA Air Safety Institute has talked about a number of them, and we’ve covered a bunch in this space as well (links to each attached). They include:

1. Always use your checklists, and use them diligently and carefully.

2. Stay focused “in the moment”; pay attention to every detail.

3. Maintain a mindset that there are mistakes out there you haven’t found yet.

4. Avoid distractions and maintain sterile cockpit rules.

5. Filter out unimportant information.

6. Maintain a clean workspace in the cockpit, with both physical and mental organization.

7. Assume the automation is wrong.

8. Improve time management.

9. Don’t accept “careless” mistakes; insist on getting it right every time.

10. Listen attentively and demand accurate and complete read-backs.

11. Keep a log of your errors and don’t repeat them.

12. Always double-check your assumptions.

13. Train for perfection.

14. Use good single-pilot or crew resource management (CRM).

The bottom line from all of this is that it’s not acceptable to think of mistakes as just the price of being human. We have to manage our responsibilities in the cockpit in pen, not pencil. You can’t erase anything and start over; you just have to get it right the first time and every time.

Kenneth Stahl, MD, FACS

Kenneth Stahl, MD, FACS is an expert in principles of aviation safety and has adapted those lessons to healthcare and industry to maximize patient safety and minimize human error. He also writes and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care, and general surgery. Dr. Stahl holds an active ATP certificate, is a 25-year member of AOPA, and has thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].
