Don’t know if you caught Jon Stewart the other night. Writerless, “The” Daily Show has been downgraded to “A” Daily Show, but Stewart is as on point as ever. On the show in question, he interviewed John Zogby of the polling firm Zogby International.
A couple of things struck me, not least Zogby’s bizarrely poor performance (is it just me, or did he essentially instantly accede to Stewart’s suggestion that political polling isn’t actually useful at all? Video embedded after the jump so you can judge for yourself). But more relevantly, one of Stewart’s questions struck a chord: “Does the data overwhelm the idea?” In a political context, this is a jab at the media as much as the pollsters, but it got me thinking about attempts to quantify innovation. There’s no Accepted Innovation Evaluation Methodology out there. Rather, there are tons of theories, often very scientific-sounding and equally often utter bunkum.
That's not to say that attempting to evaluate innovation strategies is a waste of time. Rather, there are lots of smart people and companies out there promising to have the answers, yet it often feels like they're all pointing in completely different directions.
Before blinding themselves with too many options, executives looking to implement an innovation idea or strategy need to carefully assess what kind of data would be most useful in deciding whether that idea or strategy is worth pursuing, or whether it has been successful. There are so many ways to gather information these days, not to mention so many ways to slice and dice the results, that it's important to work out what will really be useful in a particular context. And sometimes, of course, a radical proposal will have no precedent and may end up demanding a leap of faith that comes with no pre-bet guarantee.
Back in 2003, usability expert Jakob Nielsen wrote some similar advice for those attempting to evaluate the then still fairly brave new world of online user interfaces: "Market research methods such as focus groups and customer satisfaction surveys are great at researching your positioning or which messages to choose for an advertising campaign. They are not good at deciding user interface questions -- in fact, they're often misleading," he wrote. "Seeing something demo'd and actually having to use it are two very different things. Likewise, what customers say and what customers do rarely line up; listening to customers uses the wrong method to collect the wrong data."
Though far from perfect, received UI design wisdom has come a long way since then. But there's definitely still a lot to be learned about evaluating innovation. There's a desperate hunger both for the discipline of innovation and for a way to box it up neatly. But there's also a dangerous appetite for slickly presented meaninglessness, which briefly sates even as it fails to nourish. And in the long run that doesn't help anyone at all.
Here's the Jon Stewart clip:
What comes next? The Bloomberg Businessweek Innovation and Design blog chronicles new tools for creativity and collaboration, innovation case studies in both the corporate and social sectors, and the new ideas that have the power to change the way things have always been done.