
Innovation, Interrupted

Posted by: Michael Mandel on June 04

Try this:

“We live in an era of rapid innovation.” I’m sure you’ve heard that phrase, or some variant, over and over again. The evidence appears to be all around us: Google (GOOG), Facebook, Twitter, smartphones, flat-screen televisions, the Internet itself.

But what if the conventional wisdom is wrong? What if outside of a few high-profile areas, the past decade has seen far too few commercial innovations that can transform lives and move the economy forward? What if, rather than being an era of rapid innovation, this has been an era of innovation interrupted? And if that’s true, is there any reason to expect the next decade to be any better?

You can find the rest of my story on the innovation shortfall here. My analysis of the economic statistics is here.

Reader Comments

Brandon W

June 4, 2009 10:38 AM

RE: economic statistics
How long have I been arguing that all this math wasn't correlating to reality?

RE: innovation
It may be that we spent the 20th century grabbing all the low-hanging fruit. We know that the first 80% of anything is easy to attain, and then the last 20% gets exponentially more difficult as you approach the pinnacle. Improvement is not linear. Assuming there's a pinnacle of human development (a fair assumption, I think), perhaps we're at a stage where continued innovation is getting exponentially more difficult?

Perhaps we're butting up against natural limits to our intelligence, biological functions, and natural resources? I might imagine a world where everyone lives like billionaires and we all just teleport to other planets for lunch, but the limits of reality won't allow it.
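Brandon W's diminishing-returns intuition can be put in toy-model form (the numbers below are purely illustrative, not data): if each additional percentage point of "progress" costs a fixed multiple more effort than the last, the first 80% ends up being the cheap part.

```python
# Toy model of diminishing returns (illustrative only): each extra
# percentage point of progress costs 5% more effort than the last.
def cumulative_effort(progress_pct, growth=1.05):
    """Total effort to reach progress_pct, where point k costs growth**k."""
    return sum(growth ** k for k in range(progress_pct))

share_of_first_80 = cumulative_effort(80) / cumulative_effort(100)
print(f"The first 80% of progress takes {share_of_first_80:.0%} of the effort")
```

With a 5% per-point cost escalation, the first 80% of progress takes well under half the total effort, and the last 20% dominates — which is the shape Brandon is describing.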


June 4, 2009 03:36 PM

Michael, Thanks for writing such a thoughtful article about such an important subject. Let us hope, at least, that the dialogue intensifies. Reading this brings me to the realization that the most important questions of our time are mostly subject to politicization and a real dearth of serious, unbiased, methodical study. The hypotheses, theories, and monitoring of the complex systems and processes of civilization ought to represent the most important work we could do, yet it remains difficult to even ferret out enough meaningful data to offer a pertinent observation.

Brandon, Your thought about innovation seems quite plausible, especially in light of the fact that the number of humans who might innovate continues to increase. Another possibility is that we are simply ruling out, by definition, a lot of innovative activity. I do think we are overly focused on the idea that innovation has to have something to do with narrowly defined technology.


June 4, 2009 04:33 PM

I agree with LAO's first point: it's been a continual source of irritation ever since I matured enough to start reading the news.

I want to propose an amendment to BrandonW's point: the achievement curve (achievement vs. effort) is shallow not only at the high end but also at the low end. Start-up costs are often steep, and that's why biotech is still so hard. We're still very early in the game. How long did semiconductor technology exist in physics labs before it hit the mainstream? But I don't think Shockley et al. should be considered underachievers!


June 4, 2009 06:44 PM

I think that conventional wisdom has long been considered passé. That all that money was going into housing demonstrated how little people thought of it anymore. Cost cutting is cheap, sure, and profitable; innovation, not so much. That is why it is only turned to as a last resort, and everyone hopes someone else will save them the trouble.


June 5, 2009 12:06 AM


I think your conclusion is too heavily swayed by the slow progress of biotechnology.

In other areas, innovation has been rapid. Moore's Law progressed at exactly the usual rate during this decade, with no interruptions. This caused a gadget proliferation. Cellphones, Plasma/LCD TVs, iPods, household robots, home Wi-Fi, etc. did not exist in 1998. Hell, most people didn't even have laptops in 1998.

How are the electronics in a 2009 model car vs. a 1999 model car? Recall that most 1999 models still had a tape player rather than CD player, and analog odometer rather than digital.

Also, the innovation in this decade has gone toward taking mature technologies and driving them down to super-low costs, for the benefit of India and China. This offered a higher ROI than US biotech innovation. Hence, real wages for knowledge workers in India and China rose 100% over this decade.
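The "usual rate" claim above is easy to sanity-check with back-of-envelope arithmetic (a rule-of-thumb calculation, not industry data): a transistor-count doubling every two years compounds to roughly 32x over a decade.

```python
# Rule-of-thumb Moore's Law arithmetic: one doubling every two years,
# compounded over a ten-year span.
years = 10
doubling_period_years = 2.0
growth = 2 ** (years / doubling_period_years)
print(f"Implied transistor-count growth over {years} years: {growth:.0f}x")
```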


June 5, 2009 12:16 AM

I disagree with the (quoted) diagnosis. Innovation has not been restricted to narrow fields, and neither has its growth stopped.

Sure, raised bars and low hanging fruits being picked are important issues, and I will not repeat myself at length here.

But I think the key shortfall has been in the failure to translate subject-matter innovation into actual improvements in people's quality of life, and to a large extent even to productize useful lab innovations.

My scratching-the-surface diagnosis of why that was/is is the total distortion of the usefulness metric and corruption of the social process, so that value is not attributed and focus is not given to that which is useful, but that which makes the quickest buck for the insider elites.

One phrase from the full article that is quite telling in this regard: "No gene therapy has yet been approved for sale in the U.S." That's how the usefulness of anything is established - how many bucks can the owners and middlemen make.


June 5, 2009 12:22 AM

In shorter words, the problem is not technology, but the current socioeconomic paradigm, or rather its usurpation by elites who are largely disassociated from any concept of subject-matter merit.

It has happened before, and the outcome has never been pretty.

Brandon W

June 5, 2009 10:55 AM

At some point, the silicon melts and you can't make a faster processor. Electrons will never move faster than the speed of light. Humans will never live forever; the biology breaks down. We run out of trees, or usable oil (which means no more plastics), or food, or oxygen (because we cut down all the trees). At some point the math gets more complicated than any human will ever comprehend. At some point things can't be produced any smaller. I think we're approaching the limits in quite a few of those areas. "Innovating" closer to those limits will get exponentially more difficult.

Digital odometers are not what I would consider innovation.

I do agree with Cm's point. It is similar to a point I've argued before, that we measure our advancement by increases in "quantity in life" rather than "quality of life."

Joseph Alexander

June 5, 2009 11:49 AM

I listened to your BusinessWeek cover story podcast today. I disagree with a few aspects of your conclusions, but I want to focus on what you said about recent college graduates and real wages. I do agree with you that our real wages have gone down, but it is not because we have failed to innovate at a pace that would increase real wages. I would argue that the decline in real wages reflects not a lack of innovation so much as an increase in domestic and international college graduates, with more and more people getting advanced degrees. I believe it's more an issue of simple supply and demand than a lack of innovation. Thank you for being a thought-provoking journalist.



June 5, 2009 11:52 AM

One of the factors that I believe we fail to appreciate is that innovation is often a two-step effort. The first step is recognizing that we have a powerful innovation. The second is identifying a commercially viable application with an appropriate risk-reward relationship. As we have raised our societal and legal concerns over innovation in response to some of the fallout of the past, we have raised barriers to innovation and increased the difficulty of finding those viable opportunities. I believe we need to recognize that some of the leading areas of innovation could be disastrous on a global scale if they are not handled correctly. And even if they are, we will likely see some stretching of the adoption curve due to classic fear, uncertainty, and doubt. Genetically modified foods are a classic example.




June 5, 2009 02:14 PM

There was still more innovation in this decade (1998-2008) than the prior one (1988-1998). Before one jumps in with the example of 'Internet', note that in 1998, dial-up was the only form of Internet access available, and worldwide users numbered only 200 million, vs. 1.5 billion today.

Faster computing power leads to faster rates of scientific innovation. Sure, each innovation becomes harder, but the tools become progressively stronger.

Moore's Law has not stopped, and until it does, neither has innovation.

A biotech letdown does not mean that all innovation has stalled. That is but one field.

Any leftist who thinks we are about to 'run out of trees' has clearly never been to Western Canada. There are literally billions upon billions of trees, without interruption, and without any human encroachment (as the population is not growing there). If every human on Earth was assigned their own tree, it would still not cover even 1/10th of British Columbia, let alone Alberta, Yukon, etc.

Brandon W

June 5, 2009 03:50 PM

Hah! Kartik, leave it to you to completely miss and dismiss a point by parading out the "leftist" straw man.


June 5, 2009 04:22 PM

Moore's Law is a good example of the difference between revolutionary ideas and revolutionary products: generally, Moore's Law has meant incremental lithography improvements, which are not exactly a disruptive technology, just a simple evolution of what we already know how to do. Granted, there have been a number of enabling process developments (SOI, strained silicon, hafnium gates, copper wiring, tri-gate transistors, immersion lithography, etc.) that are more conceptually complex than simply shrinking line size. But the general lithographic trend has a very sudden end waiting for it, and when it arrives, disruptive technologies (those I'd call truly innovative) will have to appear for Moore's Law to continue from the consumer side.
Even aside from process technologies, computer architects are still mostly refining the ideas put out 20 years ago by mainframe designers at IBM, while graphics architectures have seen little more than scaling and a slow evolution toward general computing since the demise of 3dfx. Yet as Kartik continues to point out, changes in computing power, connectivity, and miniaturization have been amazing by objective measures and will probably continue for at least the next 20 years.

On the other hand, in Biotech and some other fields, we're learning all kinds of things that we never knew before but which are setting the stage for research and development that is really and truly new. But we haven't hit critical mass to be able to apply those ideas yet, so they look like duds even though really important work is happening.

But I hold that being able to find a place for everyone in the economy will remain a more difficult and important task for the growth of the economy than the incremental growth of productivity will be.

Tom E

June 5, 2009 06:51 PM

I had ADSL in 1998.

@Kartik said: "Before one jumps in with the example of 'Internet', note that in 1998, dial-up was the only form of Internet available"


June 6, 2009 12:29 AM

CompEng: At the risk of too much jargon, even now (i.e. slowly over 1+ decade, at a dramatically accelerating pace post-2000) we are seeing a transition from custom (ASIC) chips to heavily core/IP-based platforms, FPGA, and firmware/embedded software.

Physics (chip as well as manufacturing tech) is an ever more important dimension, but there is another problem dimension which is sheer design complexity and the social aspects/limitations of managing the *social process* of design through manufacturing, and the armies of specialized staff in an age of infinite specialization and job insecurity. (And then offshoring/multi-siting across "incompatible" timezones.)

One reason why "platforms" (for lack of a better term) and IP composition are becoming ever more prevalent is that the cost (and time!) of designing state of the art chips (in terms of functional range and complexity, as well as compliance to a multiplicity of interface standards!!) has become forbidding. The same phenomenon is driving firmware/software solutions - it is simply becoming impossible at that scale and complexity to produce chips that will actually work. You can do it once, but even a slight update or addition requires redoing much of the work. So you have to go to a higher level of abstraction (software instead of logic networks), at a cost in "efficiency".

A similar trend applies in software, and indeed in most areas of tech. Building "efficient", specific purpose oriented things from scratch has become (economically? socially?) infeasible.


June 6, 2009 01:56 AM

Tom E,

You have actually proven my point. In 1998, DSL was just starting to be offered, with an extremely small number of subscribers, at less than 1 Mbps, for $60/month.

My statement stands:

In 1998, the world had just 200 million Internet users, almost all with just dial-up. In 2008, there are 1.5 billion users, and more than half have Broadband (> 1.5 Mbps).

That is progress. A lot of it.
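Taking Kartik's figures at face value (the user counts are his, not independently verified here), the implied compound annual growth rate is straightforward to compute:

```python
# CAGR implied by the quoted figures: 200 million Internet users in
# 1998 growing to 1.5 billion in 2008.
users_1998 = 200e6
users_2008 = 1.5e9
years = 10
cagr = (users_2008 / users_1998) ** (1 / years) - 1
print(f"Implied annual growth in Internet users: {cagr:.1%}")
```

A sustained growth rate above 20% per year for a decade is, by most standards, rapid diffusion of a technology.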


June 6, 2009 02:02 AM

Brandon W,

It is absurd to claim that we are 'running out of trees'.

Provide an actual date: In which year will the nation of Canada see the number of trees in Canadian territory drop by 50% from today's levels?

Or even easier:

In which year will the state of Pennsylvania (a state in a high-GDP concentrated region of a very wealthy country), see the number of trees within its borders drop by 50% from today's levels?

You have to give me an actual year.

A bit of logging in Brazil doesn't move the worldwide needle much, no matter what you were told for 30 years.

Tom E.

June 6, 2009 10:05 AM

@kartik. I view the move from dial-up to broadband as more of an incremental change, whereas dial-up was more of a revolutionary change.

Except for video download, practically everything you can do on broadband, you could do on dial-up, albeit at a slower pace.


June 6, 2009 10:17 AM

Yes, in much of the electronics industry, that is the trend, but I also see that as evolutionary. The problem of design complexity is real and interesting (I work in pre-silicon validation, after all), but not a new phenomenon in the larger sense. The inefficiency of designing "large" purpose-built items (like CPUs, new jets, and aircraft carriers) will force us to get better at modularity, but it won't stop us from building more ambitious designs. But while that development path may feel new to electronics, it's not new to manufacturing. Think of the huge array of parts companies that feeds the auto manufacturing industry or the airline industry.

I don't see your observations as incompatible with mine. My only counterpoint is that the number of people required to design that IP is even now small compared to the number of people required to run a manufacturing company, so despite appearances, we still haven't hit the wall in the number of people we can throw at a design problem from an economic standpoint. It just feels like it because our basis for comparison is historical design projects.


June 6, 2009 01:41 PM

CompEng: I didn't mean to dispute any of your (technical) points. Progress will continue to happen, but the rate of *tangible results* may (and I think, will) stagnate or slow. In my reading that's also the spirit if not the letter of the original article - it acknowledges the innovation (efforts), but bemoans the apparent lack of "breakthrough" products in the "marketplace".

The main thrust of my concern has been that with "throwing more people" at complex-at-scale problems, the bottlenecks shift (extend?) from the subject matter to the social/collaborative sphere.

Many complexities are of a nature that implicit interactions of "logically" unrelated things hinder or outright prevent breakdown into fully-independent modules. In your line of work you should see enough examples of this.

In this regard, the "low hanging fruits" are the problems where that complexity or the scale of the problem is bounded. Many have been picked, and what remains are the ever more difficult ones.

The art of successful problem breakdown is to minimize the interdependencies between parts, and/or make the interdependency "manageable" - there is often leeway to do that. But even so, people/groups working on the respective parts/aspects have to communicate, and beyond a certain scale of interdependency that communication has to be hierarchical (in the sense of not fully N-to-N).

On top of managing the communication, at that point it appears virtually impossible to prevent information hoarding, the formation of power centers who can impose their control on the technical business process, politics, perception management, a general decline in transparency, and the corruption this enables. That's in essence the difference between a startup and the stereotypical "big corporation".

That's not to say it doesn't work at all, or it shouldn't be tried. We do it every day, and (incremental) progress is happening. It appears to me that today the limitations of social factors have grown disproportionately versus the technical factors, but maybe I'm just imagining that as "back then" I wasn't embedded in the workforce and didn't have that insight into the business.
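Cm's point that communication beyond a certain scale cannot stay fully N-to-N is the classic combinatorial one, and can be made concrete with a small sketch: full pairwise coordination needs n(n-1)/2 direct channels, which grows quadratically, while a tree-shaped hierarchy over the same people needs only n-1 links, which grows linearly.

```python
# Channel counts: full N-to-N coordination vs. a simple hierarchy.
def pairwise_channels(n):
    return n * (n - 1) // 2  # every pair of people talks directly

def hierarchy_channels(n):
    return n - 1  # a tree over n people has exactly n-1 links

for n in (10, 100, 1000):
    print(f"{n:>5} people: {pairwise_channels(n):>7} pairwise vs "
          f"{hierarchy_channels(n):>4} hierarchical channels")
```

At 1,000 people the pairwise count is roughly 500 channels per person, which is why communication has to become hierarchical (in Cm's sense of not fully N-to-N) well before that scale.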


June 7, 2009 12:34 AM

It pays to look at Eli Whitney. He scammed the Continental Congress with his interchangeable parts demo back before the U.S. was a separate nation. (That's when Congress started mucking around with industrial policy). It took him nearly 20 years to fulfill his contract. It took another 20 years before his son and Sam Colt actually figured out how to mass produce guns, and the taxpayers funded all of this. Then, it took another 20 years before manufacturers started adopting "armory practice" in their machining, and the industrialization of the U.S., now a separate nation, took off.

If you look at any industry, it is easy to forget the actual time scales of change, especially major change. You not only need new technologies, you need new management techniques. The transcontinental railway, another piece of government meddling in private enterprise, had been running 20 years before the railroads realized that it made sense to manage freight runs end to end, rather than having each district manager focus on getting freight out of his district.

If you look at accumulated change, you tend to see surprising differences. In the 1960s, everything was made out of metal and wood. Now everything is made out of plastic. Cars are amazingly trouble-free and get incredible fuel mileage. I remember the VW Beetle getting 20 mpg and being quite impressed. As a rule, children died of leukemia. Now they usually survive, and for quite some time. I remember those old codeine cough medicines that left me woozy, unlike modern inhalers that knock out the cough with no side effects.

I get the impression that we are still picking up low hanging information technology fruit. We are not even close to getting the full benefits of what we can do. We are in the silver age of materials science, but so few of us think about materials that we don't even realize it.

As for the last ten years, a lot of the problem was an administration that neglected industrial policy because it didn't believe in it. All sorts of things were going on, but there was no public focus.


June 7, 2009 12:13 PM


I see your point, but I think current industries are actually pretty good at managing incremental improvement and the hierarchy and communication required to make it go. Don't get me wrong, my company is really concerned with improving communication and decision-making processes and getting the efficiencies right there. But while those concerns about improving the rate of development are valid, they're primarily about improving competitiveness. As you pointed out, they are limiters, but development is not in immediate danger of stalling there.

Disruptive improvements, true innovations, cross-discipline improvements, things that don't have a ready niche to fill: those are the things that are threatened by the factors you point out. But as long as basic research continues to proceed, disruptive technologies will continue to bubble up until a new wave of development is enabled.


June 7, 2009 08:52 PM

CompEng, it seems we are not (fundamentally) disagreeing about anything, just perhaps looking from different angles and emphasizing different aspects.

I agree that incremental evolution is being managed pretty well, but that's almost a circular statement - clearly you make whatever progress you can make, and whatever happens to turn out, you are managing pretty well. But I think one thing the original article took issue with is precisely the rate of progress so attained.

Look at a company like MSFT. Maybe it does not say very much to pick out this or any other singular example, but MSFT is certainly a commercially successful company. However, when you look at what they managed to deliver on the Windows and Office fronts post-2000 (or say post-XP), that doesn't look like a success story to me, and not even incremental *improvement*. They royally screwed up, and have only their market dominance to thank that there were no adverse consequences. When we got Office 2007 at work, I was baffled by the inefficiency (much longer startup time, and no indication for some 5-10 seconds on a state-of-the-art machine whether the application has actually started) and the new UI, which is presumably supposed to be more flashy but doesn't add any new utility I can make out. I'm not going to comment much on Vista beyond saying I saw it recently on a newly purchased, at least average, machine and was astounded by its unresponsiveness, even with no software installed yet.

I'm inclined to believe this is not a matter of plain incompetence of the people doing the work there, or recently emerged difficulties with the subject matter of desktop/server OS and business software, but a social problem of managing the technical work in an organization of such size, where without doubt much of the "leadership" has become complacent and disassociated from meritocratic checks and balances.

Brandon W

June 7, 2009 10:24 PM

I never said "we are running out of trees." That is a wholly different statement from the sentence I wrote, and it is a classic tactic of taking something out of context (and in this case, also changing the wording) to divert attention. We won't really ever run out of trees. We'll never run out of oil. The planetary system would collapse LONG before we reach either stage. You've missed the point. Completely. The point is about natural limits. It's about natural limits that create exponentially increasing difficulty in generating innovations. You're not stupid, but you don't pay attention, and you're too wrapped up in a very narrow mindset to ever comprehend a broader discussion. You have proven that again and again. You don't have to debate my point. I don't care if you EVER debate one of my points. But if you do, at least debate my actual point and not an out-of-context snippet of a sentence I wrote. Oh, and quit pretending that tossing the "leftist" boogeyman around is anything but a diversionary tactic. It's a demonstration of your inability to think, speak, and debate objectively.


June 7, 2009 11:09 PM

Kaleberg: "I get the impression that we are still picking up low hanging information technology fruit. We are not even close to getting the full benefits of what we can do."

That's pretty much my point, stated in different terms. One can argue what constitutes a low hanging fruit, but certainly there is the problem of advances or supposed-advances not translating into widely perceived benefit. (When I mention low hanging fruit, I usually mean end-to-end, from conception to delivered utility. If the latter is not included in the consideration, then what's the point?)

"As for the last ten years, a lot of the problem was an administration that neglected industrial policy because it didn't believe in it. All sorts of things were going on, but there was no public focus."

That's right, and even more, there was an explicit paradigm of government letting the private sector (and the "invisible" hand - as if) "take care" of *everything*, in other words complete abdication of public affairs responsibility, tax cuts for passive income included. That surely has to do with the posed problem here.


June 7, 2009 11:24 PM

Kaleberg: BTW, thanks for the Eli Whitney reference. I didn't go farther than Wikipedia, but it seems the "scam" was mostly in misrepresenting the genericity of the (custom made) parts, unless he took credit for the invention, which the article doesn't indicate.

(By stating the following, I don't want to start a dispute, rather it's a perspective ...)

Whether that's a scam in the proper sense of the word is debatable; most bleeding-edge tech demos (whether in subject matter or methodology) are scripted or doctored. (You would be very ill-advised not to make sure it works, in such a high-stakes situation.) That's a general catch-22: you cannot demonstrate a mature state of affairs with something that is still to be developed and refined, and you generally will not get the funding without demonstrating more than you can deliver at today's production quality. Otherwise it would just be the state of the art, unless you have some kind of secret sauce.

Douglas B. Moore

June 8, 2009 06:29 PM

A key reason for "The Failed Promise of Innovation" is the near-universal misconception that innovation equals science and technology. When I ask seminar audiences to name an example of an innovation, no one has ever mentioned any other kind. However, this genre yields the lowest ROI. As Peter Drucker wrote, "For all the visibility, glamour, and importance of science-based innovation, it is actually the least reliable and least predictable one."

Your article reinforces this myth repeatedly, failing to note even a single one outside of this mold. To give but one counterexample, the simple idea of moving a truck body directly onto a cargo vessel resulted in an immediate quadrupling of the productivity of shipping and an explosion in world trade which was the fastest expansion of any major economic activity in history. That single Innovation Analog (SM) of bundling items to speed their passage through bottlenecks spawned a large number of innovations (e.g. PODS moving crates, the beverage fridge pack, Internet data transfer schemes) and will likely catalyze many more.

Regarding the story’s closing sentiment that "the U.S. could use a few positive Black Swans," I advise corporate clients that the easiest way to precipitate Black Swans is to purposefully go where others don’t--to the many mundane, repeatable low-tech innovation sources.

Douglas B. Moore
Principal Consultant, Moore Innovation


June 9, 2009 12:12 AM


I think the MSFT example is a nice illustration of what perverse incentives can do: I doubt any engineers are given bonuses for taking out last year's bloated features or further modularizing working code!

I guess my main problem with complaining about the rate of innovation is that it sounds like just that: complaining that someone else's progress didn't bail us out. I have trouble getting behind that attitude.


June 9, 2009 02:58 AM

Douglas B. Moore: If your example is not (an application of) science and technology, then what is it? Technology is not just widgets, but also methods, and the latter perhaps more so than the former. Science can largely be paraphrased as the structured and systematic pursuit of any subject matter.

Most progress in anything has been made by people applying structured thought to their observations (analysis), purpose driven experimentation (hypothesis testing), and deriving new principles (synthesis). And of course propagation, reproduction, and improvement of successful results.


June 9, 2009 03:03 AM

CompEng: Of course, it's always the fault of those who (are supposed to) do the actual work. When the engineers don't deliver on the promises of Marketing or management, the engineers have failed. Or, in our example, making good on the implied promises of excessive debt issuance, to be repaid with "high level innovations" from all those "investments".

Douglas B. Moore

June 9, 2009 10:11 AM

Cm: One can make an argument for defining technology more broadly as you did. However, I do not think a fair reading of the article would attribute this definition to the author.


June 9, 2009 10:39 AM

Exactly ;)

Douglas B Moore,
Certainly process improvements tend to give you a much better bang for the buck, on average, than science and technology improvements. But no amount of process improvement would have given the ancient Romans automobiles or reduced their dependence on semi-disposable slave labor. The fascination with science is that it makes new things possible, while process improvements increase the scalability, effectiveness, or cost efficiency of what is currently possible (if not yet practical). In my view, the "myth" is that making new things possible through science is cheap. In truth, it is very valuable, but also very expensive.

I think anyone who knows anything about manufacturing is aware of the power of process improvements. However, manufacturing is not currently en vogue in America.

Douglas B. Moore

June 9, 2009 02:27 PM

CompEng: I believe you misinterpreted my position as advocating only "process improvements." What I wrote is that I advise clients to look beyond science and technology to innovation sources such as Innovation Analogs (SM). There have been many transformative innovations outside of hi-tech, such as installment buying, the textbook, the volunteer army, and the discipline of management.


June 9, 2009 03:57 PM

Douglas B. Moore,

I will agree that innovations in methods and contracts are important and probably undersold. A couple of your examples simply fall under the category of "codification of knowledge". The rest I would say do fall under the category "process improvement" because they consist of achieving a known goal with a given set of resources and information, but in a better way. I don't think calling them process improvements belittles them.

In my mind, science is about getting new information about the way things interact, so that you can make better decisions. Technology is the field of practical applications of such non-obvious principles. It does stand to reason that you ought to apply less specialized and exotic principles to your problem-solving before you go after "high technology". That's the simple principle of buying low and selling high! Just the fact that "technology" is so often used as shorthand for "information technology" is an indication of a gold-rush mentality that you appear to be challenging.

But if you want to live for 200 years, cure cancer, live on Mars, etc., well, it seems obvious that you just can't get there from here unless you know more. So we need both science and good "outside-the-box" problem solving, but you're right that what we call "technology" is currently oversold.


June 9, 2009 06:30 PM


I'm absolutely with you on the over-emphasis on narrowly defined technology with regard to the kinds of innovations that can propel a society forward.

My favorite examples of the moment are: currency, publicly funded education, marriage, symbolic representation of language, matches (OK, that's technology). The common theme I see is broad accessibility and obvious benefits to individuals. Business-to-business improvements may make a few fortunes, but serious new invigoration in western economies requires empowerment and real relief in our harried and insecure individual lives, in my opinion. Yes, it could easily be surprisingly mundane.


June 9, 2009 11:03 PM

Douglas: I agree that the mainstream concept of technology is probably skewed towards the connotation of "machines and automation". But without intending to split hairs, you had framed your presentation in terms of your own conversation about technology, so technically I wasn't referring to the author's usage but yours (which I may nonetheless have misread) ...


June 10, 2009 07:28 AM

Silly article. By focusing on early-stage areas like biotech or MEMS, you're cherry-picking evidence from fields that nobody realistically expected to drive the economy anytime soon. IT should have driven the economy; it didn't, because we never deployed micropayments to pay for online services. As for static wages for college graduates, maybe it has something to do with the 80% increase in health-care costs over the same time period? Nah, better to just keep repeating that old trope and misleading readers. As for those commenters who blame Bush, he had as much to do with the lack of innovation as Clinton had to do with the '90s boom: almost nothing. We can only blame the sheer stupidity of those in industry and perhaps economists like Mandel, who waste students' time with dumb, antiquated topics like macroeconomics rather than giving them a good basis in microeconomic fundamentals.

Douglas B. Moore

June 10, 2009 10:53 AM

Cm and CompEng: This is not "splitting hairs" in that it impacts the question of "who can innovate?" The reflexive answer that it is largely an R&D matter is a debilitating belief. Because the majority of managers at my clients do not believe that innovation is an important part of their jobs, there is an understandable lack of interest in learning how to innovate.

Another obstacle is that innovation is largely perceived as an art. The attempts to transform it into a repeatable discipline (or in your words, "technology") have not yet gained traction. Therefore, I commend Praveen Gupta’s comments in the parallel forum discussion as a cri de coeur to us all:

"What we need is a better understanding of the innovation process, more resources in development of the science and engineering of innovation, make it available to everybody to benefit from it, and then accelerate business innovations with improved returns."


June 10, 2009 01:36 PM

Douglas B. Moore,

You're making some good points. I agree that problem solving should never be perceived as "somebody else's job", a specialization we don't have to share. Better problem statements, better problem solving skills, and practice in redefining paradigms according to the factors you can change are all vital skills that should not be relegated to the "R&D wizards".

That said, I had perhaps misinterpreted your original assertion to say that science is not a worthy investment, which put me a bit on the defensive. Also, I think some part of the "innovative process" will retain some properties of "art", because the toughest part of innovation is the proper statement of the problem (or conversely, a proper cataloging of opportunities). An attempt to codify the process tends to create the very "boxes" innovation is typically concerned with breaking out of. That doesn't mean we can't apply tools and processes, just that we should be aware that what semantically separates innovation from any other evolutionary improvement is the extent to which the solution to a problem conforms to a historical understanding of the problem.


June 10, 2009 10:54 PM

I agree with CompEng that innovation has a substantial "out of the box" element (the "inspiration", not the "perspiration" part). Where I take exception is the supposedly hard distinction between evolution and innovation: it's rather subjective, and there are cases of several micro-innovations adding up to a substantial whole. An honest subject matter expert can probably tell the difference, but there is no general rule.

Likewise I doubt that "breakthrough" innovation can be moved to a factory/conveyor-belt paradigm. It can happen in a factory setting, but not on factory schedules, periodicity, and predictability.

The stereotypical innovation environment is the antithesis of (rigid) "discipline", and for good reason.

Organizations that care about innovation recognize this and provide the "breathing space" for their innovators, while keeping them close to the problem domain (embedded with, or in contact with, customers and/or product groups). The breathing space consists largely of not being involved in day-to-day business execution; conversely, such involvement will kill any breathing space.

Jeff Disher

June 18, 2009 04:26 PM

The one glaring thing that stood out, mentioned with almost every 1998 innovation Michael referenced that either never commercialized or took very long to do so, was the phrase ". . . was more difficult than initially anticipated," or some variation on it. It seems human nature in business planning, and in forging new paths through the wilderness of innovation, is to underestimate the hurdles we will face (not knowing what we don't know) and to overestimate our ability to navigate the unknown (assuming we know more than we do). Poor planning and being unprepared are usually how anyone gets into a predicament. Is the answer to faster innovation, i.e., getting to commercialization sooner, more of a "measure twice, cut once" mindset (the tortoise-and-hare analogy), so that our investment dollars are used more smartly (do it right the first time)? The article implies we have borrowed too much to fund our "more difficult than expected" approach. Of course, we still have to swing hard if we are at the plate, but maybe we should choose our pitches a little more carefully.




Michael Mandel, BW's award-winning chief economist, provides his unique perspective on the hot economic issues of the day. From globalization to the future of work to the ups and downs of the financial markets, Mandel, named 2006 economic journalist of the year by the World Leadership Forum, offers cutting-edge analysis and commentary.
