In a recent series of TV ads, Post Cereals declares that it put the "no" in "innovation" by refusing to alter the 118-year-old recipe for its Shredded Wheat cereal. "We're doing the right thing—nothing," proclaims one of the ads. In another ad, Frank Druffel, the fictional CEO, enthusiastically hires a job candidate to lead product development because the candidate has no skills or experience and can therefore be trusted not to change a thing. It's funny stuff, and probably the right move for a basic food product.
Whether intentionally or not, the ad taps an undercurrent of skepticism that I hear from time to time about whether innovation might just be overrated. The question, however, is not whether innovation matters, but how to make it matter more.
Creativity, by itself, is not enough. As I've previously written in this space, inventions that aren't commercialized—no matter how creative—remain inventions, not innovations. To be commercial, an invention needs to matter enough to a customer to be worth paying for. And what matters to most customers is not the invention itself but what job it enables them to do that they couldn't do, or couldn't do well enough, before. The microwave, for example, when it was first introduced, was a terrible oven, but it was a fantastic defroster—and to many customers it was worth quite a lot to be able to keep food safe in the freezer until moments before they cooked dinner rather than having to think about it that morning or the night before.
When Enough Is Enough
Companies that get innovation wrong usually do so in one of two ways: They don't go far enough in nailing the "job-to-be-done," or they go too far, tacking on bells and whistles that customers don't value.
Business history is littered with examples of innovations that went too far. Think of the Concorde, a supersonic transport that could take passengers from New York to Paris in half the time of the next fastest airliner. It was a marvel of aeronautical engineering, but apparently, unlike the microwave, the time saved was not worth the price needed to make it profitable, and only 20 were ever built. Or consider the Segway, the two-wheeled, self-balancing vehicle that was supposed to revolutionize personal transportation but that, almost 10 years after its introduction, seems not to solve any job many people need done and so remains a niche product for highly specialized uses. The current rush to produce 3D television sets could be the next such miscalculation: It remains to be seen whether large numbers of customers want 3D so badly that they will replace their expensive HD models so soon after replacing their old units.
Often companies go too far because they battle with competitors over marginal differences instead of concentrating on what customers really value, opening up opportunities for new players to step into the void. Think, for example, of the way Boeing (BA) and Airbus (EAD) have been one-upping each other with progressively bigger, more expensive, and longer-range aircraft, while Embraer (ERJ) and Bombardier (BBD/A:CN) outflanked both of them with small regional jets. It's that kind of thing, I suspect, that gives innovation a bad name.
Stuck in a Rut
But business history is also littered with examples of venerable companies that didn't go far enough—think of the typewriter companies, like Underwood and Smith Corona, that were century-old names when word processors first appeared.
A perhaps less-obvious example is Dell (DELL). The Dell Direct business model, combining customizable computers and a highly efficient "pull" value chain, certainly qualified as an innovation when Michael Dell pioneered it under the name PCs Limited back in 1984. For more than 20 years, the company continued to innovate in that core business—adding e-commerce and introducing new supply chain efficiencies, manufacturing methods, and low-cost product designs. But when computers became a commodity and customization unnecessary, the company found itself in a bind. While it was busy managing that core business, it had neglected to work in parallel on breakthrough innovations that might have created the next new business—the kind of big bets that enable companies to reinvent themselves.
In both cases—too far and not far enough—the presumed innovation simply doesn't matter. And the failure to matter can usually be traced to a failure to manage innovation appropriately: a failure both to divide resources properly between current and future innovation and to make sure the inventions really do solve a job that needs doing.
From the Bottom, or the Top?
At first glance, Google (GOOG) appears to stand in sharp contrast to Dell as a company that manages innovation well. Google engineers can devote as much as 20 percent of their time to ideas outside the company's core business that personally interest them. In an op-ed in The Washington Post, Google's then-CEO, Eric Schmidt, stated his faith in this approach, writing that "innovation is often driven from the bottom up."
While I give points to Google for recognizing the importance of innovation outside the core and for having an explicit process for doing it, I believe bottom-up invention can get a company only so far. It might arm the business with a portfolio of options for the next new thing, but only top management can decide which bets to place when and—critically—allow for the appropriate business model innovations that are so often needed to commercialize them. Perhaps recognition of that fact is what drove the company to make co-founder Larry Page the new CEO. It has been widely speculated that the intent of the management change is to speed up innovation by driving it more from the top.
The company that best understands how to make innovation matter is, predictably but justifiably, Apple (AAPL). From iTunes to the iPod, iPhone, iPad, and Apple stores, the company has repeatedly produced innovations in the core business as well as in new businesses, such as telecommunications and music, to create sustained growth. The constant has been Steve Jobs's obsession with the user experience and a willingness to innovate not just products but also services and business models to deliver it. His obsession ensures that innovation starts at the top—in fact, designers at Apple report directly to Jobs—and that innovations are harnessed to commercialization at the outset. Although Jobs is on medical leave, he remains CEO and has said he will stay involved in strategic decisions. Further, he has so firmly imprinted his approach to innovation on the company that during his previous medical leave its stock rose 66 percent, a period in which the broader U.S. equity market rose just 10 percent.
Innovation takes institutional courage, and that can come only from the top. Not every Apple innovation has succeeded. The far too expensive Lisa computer failed before the Mac succeeded. And during Jobs's absence from the company, the Macintosh Portable failed before the PowerBook caught on, and the Newton, Apple's first tablet platform, was a disaster. Like Google, the company realizes that some failure is inevitable before you have an innovation worth harvesting. But Apple is now organized internally to make sure that the breakthroughs come with some regularity in the core business and in new businesses—and that those innovations matter.
Even the comically innovation-averse Frank Druffel gets it half right. At the end of the "no" in "innovation" ad, he admits that Post has introduced at least one innovation—bite-size Shredded Wheat. But he quickly adds, "Did we go too far?"