Boiling Frogs

We left things off last month talking about “that little voice” in the back of our heads (we anointed it “TLV”) and how it gives us really wise advice and critical safety information. 

We dissected where TLV comes from and how it gets filled with so much information, and discussed some of the reasons why we find it so easy to ignore TLV, subjecting it to our own “Curse of Cassandra.” The message is clear: all that advice is great as long as you heed its warnings. And that brings us to the million-dollar question we didn’t answer: why don’t we listen, and why don’t we follow our own advice? It seems that an awful lot of bad outcomes are followed by some famous words like, “I knew I shouldn’t have done that!” Well, if you knew, then why’d you do it? That answer brings us right back around to the top: you didn’t listen to your own advice.

There’s no doubt that developing our own private Curse of Cassandra is a slow process, and I’d bet the psychology of disregarding our own advice is a little different for each of us. Bit by bit we bargain with TLV and stretch our self-imposed safety limits until we’re totally outside of our safety envelope. Maybe you never flew your tanks less than one-fourth full without landing to top off, but every few flights you let them run a tiny bit lower, until one day you find yourself up in the air, a long way from an airport, with a stiff headwind and dangerously close to fuel exhaustion. We covered some of the ways this can happen a while back in the article “Runnin’ on Empty,” but we can drill further into it and add more detail from this angle. Negotiating your fuel reserves with that little voice to reach a “compromise” may not seem like totally ignoring it, but it still counteracts all your previously self-imposed safety standards. Because it happens slowly, we get further and further out on a safety limb. It brings to mind that old urban legend about cooking frogs.

The boiling frog fable has been around for a long time. The premise is that if you toss a frog into boiling water, it will jump out, but if the frog is put in cool water that is brought to a boil slowly, it will stay in the pot and get cooked. The legend says that with the slow, subtle change in its surroundings, the little guy just doesn’t perceive the danger until it’s too late. Maybe the frog was bargaining with its own little voice that the water isn’t all that hot – yet – but accepting the gradual change in its environment ends in a catastrophic conclusion. Because it happened slowly, the frog didn’t pay attention, just like we don’t pay attention. The fable is a metaphor for our inability, unwillingness, or just blindness to threats that creep up on us gradually rather than pop up suddenly.

Fortunately for frogs, the old legend is probably not true, even though some 19th-century experiments suggested it was. In 1869, German physiologist Friedrich Goltz did some frog-boiling experiments and showed that only a frog that has had its brain removed will remain in slowly heated water; an intact frog bails out as the water gets hotter. There wouldn’t seem to be much to take away from an “experiment” like that, and modern biologists have shown that the legend is false. A little guy that is gradually heated will jump out of the pot, since escaping rising heat is a fundamentally necessary strategy for frog survival.

But for us, with our brains still in our heads, the frog-cooking legend sure seems to be true. So the questions remain: why do we let it happen again and again, how does it happen, and what can we do to prevent it? As common as this is, it’s no surprise that lots of people have tried to figure out the answers. This kind of slippery slope to disaster has been referred to as creeping normalcy. Creeping normalcy is one of those cognitive biases that we’ve talked about before, but it’s a particularly insidious danger since we are totally blind to it.

Fred Rauch, 341st Space Wing Antiterrorism Officer at Malmstrom Air Force Base in Montana, wrote a great article on the topic in their wing safety bulletin. “Creeping normalcy refers to the way major, and often unacceptable, changes can be accepted as normal if it happens slowly in unnoticed increments. Somebody, either through a conscious decision, a mistake or just lack of caring, compromises the standard ever so slightly. It is not noticed, operators fail to correct the error, or rationalize it as satisfactory, or worse, simply totally ignore it. These errors are eventually accepted as normal and a new baseline is established. From there it happens again and again. Each step, each compromise, is not far from the new baseline, but over time, usually professional teams and individuals are operating very far from acceptable standards way below the original safety baseline.”

Complacency plays a big part in creeping normalcy. It drives a process where negatives are slowly allowed into our daily habits in such small increments that they eventually become the “new normal,” setting up trends that are as dangerous as they are undetectable. They accumulate over time, baby step by baby step, into extremely serious problems. Perception is altered because the changes are gradual, and it all fits with the frog in the pot. Back in 1888, William Thompson Sedgwick would have agreed with Mr. Rauch. He said it all comes back to different heating rates: “The truth appears to be that if the heating be sufficiently gradual, no reflex movements will be produced even in the normal frog; if it be more rapid, such is not the case.” All these concepts of frog boiling, creeping normalcy, and complacency reflect right back on our safety habits. But it’s not just about making good habits, another topic we’ve covered in detail; it’s about sticking to them, allowing TLV to remind us of the good habits we’ve formed and to keep us from creeping toward disaster.

Let’s go back to the fuel exhaustion example above. Running out of gas is an avoidable disaster. Over land there might be some outs, but it has even greater consequences for pilots who fly airplanes over large open stretches of ocean. Military pilots have a term in their SOPs, “BINGO,” that doesn’t refer to a parlor game; it’s a deadly serious safety concept. BINGO defines an actual point in time and space beyond which the airplane is not able to safely get back over land and reach an airport. Just like our own fuel calculations, this point in space is determined by fuel load and burn, distance back to an appropriate airport, wind, weather, and aircraft performance. Using these factors, an actual physical box is drawn on the map with sides defined by those numbers, and as long as the BINGO point is within the box, the crew is always within their safety envelope and able to get back to dry land. The FAA also uses the safety box concept and has rolled it into some pretty specific regulations governing intercontinental commercial carriers; the rules covering fuel and single-engine emergency safety standards are called “ETOPS” (Extended-range Twin-engine Operational Performance Standards).
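The BINGO idea boils down to simple arithmetic. Here is a minimal Python sketch, using entirely hypothetical numbers and function names: given fuel aboard, burn rate, a required reserve, and groundspeeds out and back, it computes the farthest distance from the airport at which turning around still lands you with your reserve intact. This is an illustration of the concept only, not a flight-planning tool; real planning uses aircraft performance charts, forecast winds, and regulatory reserves.

```python
# Illustrative sketch only -- not a flight-planning tool.
# All names and numbers here are hypothetical.

def bingo_distance_nm(fuel_gal, burn_gph, reserve_gal,
                      gs_out_kt, gs_back_kt):
    """Farthest distance (nm) from the airport at which turning around
    still leaves `reserve_gal` in the tanks on arrival."""
    usable = fuel_gal - reserve_gal          # fuel you may actually burn
    if usable <= 0:
        return 0.0
    # Hours of flying the usable fuel buys at the given burn rate.
    endurance_hr = usable / burn_gph
    # Flying out at gs_out and back at gs_back, a distance d costs
    # d/gs_out + d/gs_back hours, so d = endurance / (1/out + 1/back).
    return endurance_hr / (1.0 / gs_out_kt + 1.0 / gs_back_kt)

# Example: 40 gal aboard, 10 gph burn, 10 gal reserve, and a headwind
# coming home (130 kt groundspeed out, 110 kt back).
d = bingo_distance_nm(40, 10, 10, gs_out_kt=130, gs_back_kt=110)
```

With those hypothetical numbers the turn-around point works out to just under 179 nm; notice how the headwind on the return leg pulls it closer, exactly the effect the fuel-exhaustion scenario above warns about.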

This concept of a safety box is a really useful image for us in our GA cockpits too, but it’s up to us to enforce our own standards. The value of the “safety box” for us is that it can be defined with actual numbers and data. Your own safety box needs to take into account all the human factors (defined right here in the AOPA pages by the “IMSAFE” model), your currency and skill for the mission, aircraft performance, fuel load and burn, wind, and weather. The whole system, in theory, is then monitored by TLV. As long as we allow it, TLV serves as a constant reminder to stay inside the boundaries of the safety box. In the military cockpit, one of the pilots will call out “5 minutes to BINGO” – not because they have all the numbers covered on their game card, but to remind all crew members that the airplane is getting near the “edge of the box” and it’s about time to turn around. It’s worth reinforcing the point: when you’re flying single-pilot, you only have TLV to remind you that you’re getting too close to the edge of the box, or worse, beyond it.
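The “5 minutes to BINGO” callout can likewise be sketched as a tiny monitoring rule. Everything below (function names, the 5-minute threshold, the sample numbers) is hypothetical and only meant to show the shape of the logic: compare distance flown against the BINGO distance and start calling out when the remaining time drops below a warning threshold.

```python
# Hypothetical sketch of the "minutes to BINGO" callout logic.

def minutes_to_bingo(bingo_nm, dist_flown_nm, groundspeed_kt):
    """Minutes until the aircraft reaches the BINGO turn-around point."""
    remaining_nm = max(bingo_nm - dist_flown_nm, 0.0)
    return 60.0 * remaining_nm / groundspeed_kt

def bingo_callout(bingo_nm, dist_flown_nm, groundspeed_kt, warn_min=5.0):
    """Return the crew callout for the current position on the outbound leg."""
    t = minutes_to_bingo(bingo_nm, dist_flown_nm, groundspeed_kt)
    if t == 0.0:
        return "BINGO -- turn around now"
    if t <= warn_min:
        return f"{t:.0f} minutes to BINGO"
    return "inside the box"
```

The point of the sketch is that the callout is a monitoring loop, not a one-time calculation: somebody (or something) has to keep checking position against the edge of the box. Single-pilot, that somebody is you and TLV.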

Another useful part of the analogy is that establishing the size of the box, and how much leeway you allow yourself around the edges, pretty much defines your safety habits. Your risks increase incrementally the closer you get to the edge, or worse, the farther outside the boundaries of the safety box you let yourself wander. It’s crucial to know when you’re out there at the edge and not just accept life beyond your safety envelope as the “new normal.” The more creeping normalcy you accept, the further out you’ll find yourself. If you keep wandering to the borders of the box and beyond because you just don’t listen to your own advice, you’re testing your own frog-boiling limits. By normalizing your own deviancy, you’ll soon be spending a lot of time outside your safety box, and we’ve seen the disasters that can result. Also keep in mind that the borders of the box must be determined for every flight, since they’re always different. They can even change right in the middle of a flight, when uncertainties of weather, wind, or mechanical issues force you outside the box. That’s the safety value of contingency planning.

Not only do you need to hear your little voice’s warnings that you’re coming up on your BINGO, you have to heed TLV telling you how to get back inside the box. As we’ve covered, TLV usually has the right answer for you, just like all the examples last month with TLV telling you that the nearest exit is behind you, or about the river under you, or the long, straight, empty road in front of your burning airplane. A saying often attributed to Aristotle, some twenty-five hundred years ago, holds that “You are what you repeatedly do. Excellence is not an event, it is a habit.” In the two and a half millennia since, not much has changed. Develop good habits and REPEATEDLY stick to them. Define and maintain the borders of your safety box. Don’t ever bargain with your little voice to creep outside it and compromise your safety standards, because soon you’ll be living out there. Don’t try to convince yourself that the water isn’t all that hot, because soon it will be. Heed your own advice, listen to your little voice to stay in the box, and you’ll stay safe!

Kenneth Stahl, MD, FACS
Kenneth Stahl, MD, FACS is an expert in principles of aviation safety and has adapted those lessons to healthcare and industry for maximizing patient safety and minimizing human error. He also writes and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care, and general surgery. Dr. Stahl holds an active ATP certification and is a 25-year member of AOPA with thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].
Topics: Pilot Health and Medical Certification