At the same time as the Southwest 737 Flight 812 debacle was unfolding–almost as rapidly as the fuselage skin tore off the aircraft shortly after departure from Phoenix–a book crossed my desk that could have been written for the aviation industry, and Boeing and Southwest in particular. But the FAA could also take a lesson. Rarely do I read a book not written specifically for our industry that I think is a must-read for aviation executives and workers alike, from top to bottom, from airlines to GA, from captains to mechanics to gate agents and everyone in between. And, dare I say, most especially for the FAA. That book is Willful Blindness: Why We Ignore the Obvious at Our Peril by Margaret Heffernan.
The lessons of the book were hauntingly present as I read various media reports of Boeing appearing to take the public relations hit for its largest customer, saying it was surprised (surprised!) by the cracks in a high-cycle aircraft that tore open a five-foot skylight in the fuselage, to the horror of the passengers and crew. According to The Wall Street Journal, a Boeing spokesperson, referring to the damage on the 737, stated, “We did not expect this to happen.” Wow! Really? Boeing didn’t expect a high-cycle aircraft to develop cracks that could ultimately break open and peel away the skin of the fuselage? Sounds hard to believe.
I wondered whether Boeing executives thought the public was as willfully blind as they were pretending to be. To me it was an especially sad commentary on the major American aircraft manufacturer to be publicly stating that it didn’t know that cracks in this area of the fuselage could cause the metal to fatigue and give out. After Aloha’s hull ripped open and a flight attendant was sucked out to her death, issues of aging aircraft were on everyone’s radar. Or so I thought. Then they seemed to slip off the radar screen. But if we needed reminding, there was the 2009 hole in another Southwest 737, and the hole in an American 757 that caused an emergency landing in October 2010. It’s hard to believe that the danger of cracks anywhere on the fuselage of an aging aircraft opening a hole large enough to cause rapid decompression, fainting passengers and an emergency landing was unforeseen! That’s taking willful blindness to a whole new level. And the media spin seems nothing short of an attempt to dupe the traveling public!
As I sat down to write this article, the dramatic video of an Air France A380 spinning a Comair commuter on the taxiways of JFK played and replayed on every TV station. Somehow, the FAA was convinced to give the Port Authority of NY and NJ (the airport authority that runs JFK) a waiver of its taxiway width requirements. So, notwithstanding a wingspan significantly greater than that of the usual jumbo jet using JFK, the A380 was given a pass to use those taxiways. What was the Port thinking? I suppose their eyes were blinded by the thought of losing the new large-aircraft sweepstakes. But why did the FAA go along with it? It makes one wonder about some of these cozy relationships between the regulated and the regulator.
Developing 20/20 Foresight
But back to the importance of this book and its relevance to the Southwest damage, the Air France A380 incident at JFK and almost every accident I’ve ever investigated, whether on the NTSB or before. The author highlights a phenomenon all too familiar to accident investigators: why only in hindsight is the trail of missteps or corner-cutting so obvious? Why are whistleblowers ignored until it’s too late? A wide range of disasters–including the global economic meltdown; Enron’s collapse; the BP refinery disaster in Texas in 2005, which foreshadowed the even larger catastrophe last summer in the Gulf; the scandals of Abu Ghraib and the lackluster reactions of FEMA and the federal government to the unfolding tragedy in New Orleans after Hurricane Katrina–fits into this category.
In every example, I see–and I believe you would, too–an obvious parallel to the aviation industry. Whether it’s the effects of fatigue on decision-making or outsourcing or oversight, we ignore these lessons at our own peril.
The author gives only one example specific to aviation: an NTSB study of 37 accidents that found that 25 percent could have been prevented if someone had challenged the pilot-in-command’s incorrect decision. But her analysis of the effects on decision making of factors such as fatigue, which can make workers too tired to see the impacts of their decisions and take on greater risk than they should, is clearly applicable to those of us in every facet of aviation. Interestingly, fatigue, according to the psychological studies she cites, can also make people more likely to make flawed moral judgments, allowing them to ignore clear danger signals. And financial pressures can, of course, cloud that moral vision even more. (Should anyone at Continental, Colgan or the FAA have really been surprised that a commuter pilot working long hours for less than subsistence wages, sleeping in pilot lounges, made faulty decisions trying to land in Buffalo at night in bad weather?)
The author’s comments on outsourcing, whistleblowers, fatigue and cost-cutting provide so many lessons for aviation and its regulators. I won’t spoil any more of the book other than to say that she does end on a hopeful note. Just as many of us choose to be “willfully” blind, so can we choose to “willfully” see.
Of course, that means taking responsibility for seeing what can be uncomfortable, expensive or politically inexpedient. That could be a tough order for some at the FAA. As a long-time Flight Standards employee once told me, “The buck stops nowhere there.” Well, FAA, prove me wrong. For starters, fix the maintenance manual errors, mandate kids’ seats, and do some serious (unannounced) auditing of outsourced airline maintenance.