
Avoid the Clueless Trap

We’ve spent a lot of time over the last several months drilling down into how we make decisions, formulate plans, and then execute them.  

With all that as background, it seems now is the time to take a deeper look into another piece of this complex puzzle: the decision-making process itself. Just like learning anatomy in medical school, to understand the anatomy of our decision-making methods we need to dissect the whole process and peel back all the layers. When we get to the bottom, we end up at a very basic level: figuring out just how it is we think about thinking. Many of you who have followed along in these pages probably know by now that I'm fascinated with these kinds of puzzles, with how we think about and process the information we use to solve complex problems. It's something psychologists call "metacognition," and we'll come back to it in a minute, but first we need to answer the question of why understanding our own thought process is such an important safety skill to bring to everything we do, especially when we're in the cockpit. Since every decision has both a bright side and a dark side, the answer is simple. The dark side of the decision-making process is making the wrong decision, and without a basic understanding of how our thought process works, we have no way to assess and judge the quality of our decisions; we're left blindly trying to recognize the bad ones that lead to wrong choices. It doesn't matter whether we're trying to stay safe in the air, on the road, or keeping patients safe in the operating room; it all hinges on an understanding and self-awareness of our own decision-making process.

Two psychologists, Justin Kruger, PhD, and David Dunning, PhD, published a study on just how dangerous it can be when we try to make important decisions but lack insight into the decision-making process. Their paper in the Journal of Personality and Social Psychology is a scary but insightful study of the risks of this lack of self-awareness and self-assessment. The title says it all: "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." It's a great read, and I recommend clicking the link and taking a couple of minutes to get some sobering insights into one of our most common decision-making pitfalls. The investigators studied several groups of volunteer students who were given a variety of skills and material to learn, then tested the students' skill and knowledge once the subjects "thought" they had mastered the material. After the students took the tests, the investigators asked them how they "thought" they had done on their exams. The findings were sobering: the worse the students' performance on the objective tests, the better they "thought" they had done. These poorly performing students were simply clueless about their lack of knowledge and had no insight into the pitfalls of their own thinking process. At the other end of the spectrum were the test subjects who did the best but "thought" they had done poorly. This second group contains the real "experts," the ones always striving to do better, questioning their skills, knowledge, and performance; the type we all should want to be and the kind of person to have in the other seat in the cockpit. Reviewing NTSB accident reports certainly confirms Kruger and Dunning's findings: the more clueless pilots are about their decision making, skills, and problem-solving abilities, the more risk they accept and the greater the chance of failure.

With all this in mind, let's get back to the basic idea of thinking about and evaluating our own thinking that I called "metacognition." It's a fancy name, but it's one of the most basic skills we need to master for safety in everything we do, whether it's down here on terra firma or up in the sky. It's at the heart of self-awareness, of how we think about and mentally solve problems and then judge our decisions. The term was coined by developmental psychologist John H. Flavell and comes from putting "meta," meaning "beyond," together with "cognition," thought. According to his theory, published in the late 1970s, there are three components necessary to understand and assess our own thought process. The first component of his model is evaluating our own capabilities. As the Kruger and Dunning study cited above shows, that's not always a fair and objective assessment. There are real risks in overestimating our capabilities: overconfidence from having performed well in the past leads to less accurate judgments of our performance on future tasks, and that's a clear danger for pilots. The opposite is also true; the test subjects who were not as confident tended to do better on complex tasks. That leads to the second part of understanding our own thought process, procedural knowledge: how we perceive and evaluate the difficulty of a task. Not having this basic capacity to assess risk leads to the infamous "hold my coffee and watch this" remark that makes those of us in the right seat cringe. Combining an underestimation of how difficult a task is with an overestimation of our capabilities sets us up for disaster. The third component of metacognition is strategic knowledge, which Flavell defines as our capability for using various strategies to learn information. Strategic knowledge is a really important part of reaching "expertise" in the complex tasks of aviation and everything else we do, and it's something that can only be developed with practice, repetition, and experience. Metacognition, at its heart, refers to a level of thinking that involves active control over our thought process, and it is the bedrock of understanding the decision-making process itself. With this knowledge we can accurately assess the quality of our decisions and problem-solving plans and avoid the clueless trap so well stated in Kruger and Dunning's title, "Unskilled and Unaware."

There's lots of material out there in the aviation training world that teaches thinking skills to improve decision making. Some really good stuff is right here at AOPA, where you can take a whole course titled "Aeronautical Decision Making" (ADM). There certainly is value in all the skills these models propose, but our deeper dive into the mechanism of decision making, not just the rote steps, is also important because with it comes the capacity to judge, assess, and evaluate our decisions. Teaching students metacognitive strategies, such as the classic "DECIDE" acronym for ADM (detect, estimate, choose, identify, do, evaluate), has been shown to improve learning and help them master not only the vocabulary of expertise but also a more profound level of logical and thoughtful problem solving. World champion boxer Muhammad Ali, always known for his deep insight and eloquence, once said, "What you are thinking about, you are becoming." Make the effort to really think about how you think, how you assess risk, process information, and self-police the decisions that have life-and-death consequences. With that understanding, "what you are thinking about" and how you think about it will keep you safe.

Kenneth Stahl, MD, FACS
Kenneth Stahl, MD, FACS, is an expert in principles of aviation safety and has adapted those lessons to healthcare and industry to maximize patient safety and minimize human error. He also writes and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care, and general surgery. Dr. Stahl holds an active ATP certificate and is a 25-year member of AOPA with thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].
