
Almost

In his 2014 epic fantasy novel, “Words of Radiance,” Brandon Sanderson asked, “Can something be ‘almost’ an accident?”

You bet it can, and understanding why is one of the most important elements of staying safe in the air. A couple of months ago we left off our deep dive into personal methods of threat and error management (TEM) with a way to uncover your own safety risks by doing a personal root cause analysis (RCA) after an incident. I commented, though only briefly, on the value of studying “almosts,” those near misses that are “almost” an accident. A few years ago Don Rumsfeld, the former Secretary of Defense, was asked if he would do anything differently in southwest Asia if he could do it all over. His only comment was, “We followed the wrong rabbit.” That’s a gigantic mistake, but you can avoid those kinds of events with an honest look at your own errors. The RCA process puts you on the trail of the right error rabbit; chasing the wrong rabbit leads to repeated mistakes that only get more serious each time. Think back to the post on “Why Pencils Have Erasers” and recall my point that pilots rarely get do-overs. That’s why “almost” is such an important topic and deserves a more detailed discussion.

If you still need more proof of the value of studying near-miss events, I suggest you spend a couple of evenings reading one of the creepiest, scariest books I’ve ever read, “Command and Control,” by Eric Schlosser. Just don’t expect a great night’s sleep after you’re done reading it. Published in 2013, it’s an awesome work of research that details the many mishaps (the Air Force’s euphemistic code name was “Broken Arrow”) that characterized American nuclear power development and nuclear weapons research and deployment. Mistakes happened right from the start of the Manhattan Project, whose first chain reaction was run in a squash court under the viewing stands of Stagg Field at the University of Chicago in December 1942. It was pretty much downhill from there in regard to incidents, errors, missteps, and near disasters (“almosts”). These included dozens of 42,000-pound, 10-megaton thermonuclear bombs falling from USAF bombers (a couple rained down over South Carolina and New Mexico!), being mishandled, mislabeled, “misplaced” (yikes), and generally mistreated. Just reading that book makes you wonder how the planet ever survived two countries (the Russians actually had more incidents) racing each other for decades to develop the most destructive weapons imaginable. We obviously did survive, but only by learning from all those near misses, becoming proactive rather than merely reactive, and never letting the same mistakes happen again.

The plain fact is that accidents don’t happen “by accident.” Human factors and safety experts have known for years that a string of near misses portends an upcoming disaster, and history has shown repeatedly that most accidents were preceded by exactly these kinds of warnings and near accidents. That is why these experts have long argued for analyzing near-miss data. Near misses are certainly a cheaper way, in both money and human terms, to learn safety lessons; compared to learning from actual tragedies, they are essentially zero-cost learning opportunities. Charles Perrow, in his 1984 book Normal Accidents, claims that accidents are to be “expected” and are just the “price of being human” or the “price of doing business” in our potentially risky occupation: “Our ability to organize does not match the inherent hazards of our organized activities. What appears to be a rare exception, an anomaly, a one-in-a-million accident, is actually to be expected. It is normal.” This can’t possibly be true, or the world wouldn’t have survived the nuclear arms races of the 1950s and ’60s. Mistakes and human errors probably are inevitable, but I just can’t accept the theory that nothing can be done about them to interrupt the onrushing freight train of accidents.

The profound wisdom of classical aviation crew resource management (CRM) theory has always been its starting point: that human error will happen. The brilliance of CRM is the system it puts in place to interrupt the cycle of minor errors before they lead to accidents. Incidents and accidents are the sum total of a number of smaller component missteps, and personal threat and error management, preventing bad outcomes, hinges on keeping all those little missteps from propagating all the way to tragedy. That can be done by analyzing and learning from them.

There is a wealth of near-miss information right here in the aviation world. We’re all familiar with the Aviation Safety Reporting System (ASRS), which has been collecting voluntary reports of close calls from pilots, flight attendants, and air traffic controllers for close to 50 years. The theory is that if enough airmen tell the FAA (via the confidential NASA-administered reporting system) about a similar problem, the trending events can be collated and the issues published so the problems get corrected before a major calamity strikes. The ASRS was started in 1976, after the 1974 crash of TWA Flight 514 on approach to Dulles International Airport killed all 92 people on the airplane. The investigation that followed found that the pilots misunderstood an ambiguous response from the Dulles air traffic controllers. But before that accident, another airline had warned its own pilots about a similar near miss without sharing the information with anyone else. Near misses are always easier to fess up to, since they are mistakes that didn’t cause harm. The ASRS system of collecting near misses has had a hugely positive impact on aviation safety; hundreds of near-miss observations have led to subsequent safety enhancements. Recent NTSB data shows that the rate of fatal aviation accidents has dropped by more than half, to one fatal accident in nearly 4.5 million departures, down from one in 2 million flights in 1997.

Think back to that old Jim Reason “Swiss cheese” model of incident propagation through “holes” in a system of supposed safeguards. What you can learn about your own flying habits by studying your near misses is where the “holes” are in your personal safety Swiss cheese. The takeaways from all of these crucial safety lessons are numerous: (1) Take the time to study your “almosts,” your near misses, using the RCA template we talked about before, and do a serious and thorough job of it, with just as much emphasis as you would give an actual event. (2) The more near misses you have, the closer you are to a real disaster, so keep track of them and get very concerned if near misses are creeping into your flying habits. (3) Correct every mistake you find. (4) Don’t repeat them. Stay safe by focusing on your own safety.

Kenneth Stahl, MD, FACS
Kenneth Stahl, MD, FACS, is an expert in the principles of aviation safety and has adapted those lessons to healthcare and industry to maximize patient safety and minimize human error. He also writes and teaches pilot and patient safety principles and error avoidance. He is triple board-certified in cardiac surgery, trauma surgery/surgical critical care, and general surgery. Dr. Stahl holds an active ATP certification and is a 25-year member of the AOPA, with thousands of hours as pilot in command in multiple airframes. He serves on the AOPA Board of Aviation Medical Advisors and is a published author with numerous peer-reviewed journal and medical textbook contributions. Dr. Stahl practices surgery and is active in writing and industry consulting. He can be reached at [email protected].
