Harvard Business Online
Northwest Flight 253's Lessons for Leaders
The botched bombing of Northwest flight 253 from Amsterdam to Detroit haunts millions of air travelers facing (and wanting) tighter airport security. The lucky passengers on NW 253 subdued the terrorist and witnessed minor fires but no explosion. Now the important question is why he was allowed to board the plane in the first place.
Leaders should ask themselves the equivalent question about what can go wrong and how to prevent it in their own domains. Global financial crashes, regional power outages, laptops with pyrotechnic batteries, or tainted food ingredients, to name just a few, present similar problems. The NW 253 scare offers an instructive example of the roots of systemic failures. Every leader can learn a few lessons.
Human intervention. Databases are only as good as the human intelligence of the people who use them. The SEC was evidently tipped off that something was fishy about Bernard Madoff but did not investigate deeply or stop the scam. Somehow hackers find gems when they steal data files, and climate change deniers can leak hundreds of British scientists' emails to the public, yet information pointing directly to the NW 253 suspect was lost in a watch-list database that is not particularly large. Data entry is a first step that means nothing by itself. Yet many professionals seem to think that if they send one email announcement, their job is done; now it's someone else's worry. Instead, it is critical to check whether messages are received and acted on. Leaders must push for relentless follow-up, holding people accountable for what happens after they do their part.
Pattern recognition. To be meaningful, isolated pieces of information must be connected. The NW 253 debacle was preceded by missed signals and uncorrelated intelligence—however partial, incomplete, and non-obvious—as an unnamed federal official told New York Times reporters. But isn't non-obvious the point of secrets? If somebody stumbles upon a bit of information but works in isolation, he or she might not see its significance. In an era of social networking, instant messaging, and continual tweeting, it should be easy to encourage people to share and connect their data points to find patterns. Leaders should reward pattern-recognizers.
Wider communication. Communication across silos is even harder when external partners are involved. If news reports are accurate, British officials were investigating the NW 253 terrorist and put a hold on his student visa for the U.K., but might not have alerted American authorities, who had previously issued him a visa for the U.S. In a system with global dimensions, harmonization of standards should be a goal even if it means one partner ceding some protected information to others. Leaders should stress the importance of passing on items of value to others in the extended family network.
Credible leadership. Top leaders, especially the CEO, must take systemic failures seriously and personally, acting fast to assert control and reassure affected parties. Johnson & Johnson's former CEO James Burke became a management legend by immediately removing all Tylenol from shelves and the market after a tampering scare—no waiting, no obfuscation. In contrast, President Obama lost credibility following the NW 253 episode by sending messages criticized as overly mild from a luxurious vacation before eventually springing into action—just as President Bush lost credibility by what was widely considered an overly casual reaction to Hurricane Katrina.
Furthermore, leaders should not let subordinates underplay matters of grave importance. After NW 253 landed safely, Homeland Security Secretary Janet Napolitano made statements about the system working when it had obviously failed. If there is uncertainty about what happened, leaders can say so. But if they are absent or casual, they lose support for the big changes needed to redesign the system.
Leaders set the tone about how seriously everyone else takes the situation. And that, in turn, determines how hard people work on the analysis, communication, and coordination that can catch the next vulnerabilities before they get out of hand (or board the plane). Accidents might be "normal" byproducts of complexity, as Yale sociologist Charles Perrow argues, but human actions can still make a difference.