Innovation & Design

The Price of Forgoing Basic Research

The decline of the corporate research lab and the commercialization of academia damage the prospects for innovation and growth

In the "good old days," the industrialized world was peppered with corporate research labs. At the same time, universities were generally well funded. Curiosity-driven research, a key component of innovation, was the ethic of the academy. The university produced great minds who were encouraged to think deeply and creatively without consideration of commercial relevance. Industry selected appropriate candidates from among this cohort and gave them a home where they could generate proprietary intellectual property on which that company's future could be based. Along the way, all did some outstanding basic science.

But since about 1970, industry's investment in basic research has been in decline. At the same time, there has been a significant shift within academia toward applied, "industry-relevant" research. I believe that these trends do not augur well for the future of industry, academia, or society as a whole.

The Decline of the Corporate Lab

Some might argue that the decline in the number of corporate research labs is no bad thing—that the market made a correction and halted investment in things that did not provide an adequate return. I can even hear someone bringing up Xerox PARC (where I worked) as an example. "Hey, they developed the laser printer, local area networks, and personal workstations and were still not a player in personal computing!" Well, if you want to argue that a failure in a particular technology transfer is sufficient to condemn the whole notion of corporate research, then we will just have to disagree. The inventions of Nylon, Lycra, Teflon, and Kevlar provide a clear illustration of how investment in research can sustain the long-term viability of a corporation, in that case DuPont (DD).

Others might argue that corporate research has simply moved to other parts of the organization—to places where it can be more integrated with the rest of the company and therefore accelerate the adoption of research. They might even support such a conclusion by referring to the data reported in sources such as the OECD Science, Technology & Industry Scoreboard 2007, which indicates that, in general, reasonable investments in research and development are being made by industry, academia, and government.

But the term R&D is so broad as to border on useless for purposes of analysis, since it covers the whole gamut of activities from basic research to product development. It ignores the significant difference between the work of a Nobel laureate and that of a junior programmer. As early as 1980, the economist Edwin Mansfield showed that throwing everything into one R&D bucket obscured the fact that corporate investment in basic research, and even advanced development, was in decline. In that now-classic paper, Mansfield surveyed the R&D spending of 119 firms, which together accounted for about 50% of R&D expenditures in the U.S. He found a reduction of approximately 25% in their investment in basic research between 1967 and 1977.

That may seem like a long time ago—but think how long it can take for research or development to play out (for more on this topic, see my previous column, "The Long Nose of Innovation"). All of a sudden, the issue becomes contemporary. As a result, we should be skeptical of reports that lull us into believing that our R&D bucket is adequately full.

There will still be those who argue that industry can no longer afford to undertake basic research and that any investment is best made in applied research and development. I would direct them to one of the more striking conclusions of Mansfield's study: that for a given investment in R&D, there is a significant and direct relationship between the percentage applied to basic research and total factor productivity. That is, the return on investment goes down as the R&D budget shifts from basic to applied research.

Academia's Faustian Bargain

Universities, meanwhile, have been encouraged by various subsidies, fund-matching schemes, and tax incentives to pursue the hope of generating mini-Silicon Valleys and incremental revenue through the licensing to industry of intellectual property resulting from their research. (In the U.S., this was given a boost by the Bayh-Dole Act of 1980, which enabled universities to patent and license the results of federally funded research.) Academia has apparently been only too happy to strike a Faustian bargain, redirecting its priorities toward the shorter-term objectives of industry.

Take, for example, the deal between Virginia Commonwealth University and Philip Morris (PM), reported earlier this year in The New York Times. Notably, it is not the university, the sponsored academics, or peer review that controls what, if any, research results can or will be published. Philip Morris does. Furthermore, the original deal also insisted that the university not make any public disclosure of the terms of the contract. So much for academic freedom and open research.

The foundation of academic life is the community of scholars. This can only be sustained if academics have the freedom to talk about their work with their most knowledgeable colleagues, no matter what university they are associated with. The growth of knowledge stemming from an open community of scholarship should trump the hypothetical short-term commercial potential of any patentable idea. Yet in this Faustian world, as an academic with international stature in my field, I cannot speak to you if I have any belief that you, as an academic, may patent ideas stemming from our conversation. This is unacceptable.

Healthy universities need to understand that their primary role is long-term, curiosity-driven basic research. To be blunt, I believe that the moment academic research starts demonstrating industry relevance is the moment its funding should be cut off, not augmented.

A New Innovation Ecosystem

It seems to me that we have lost any appreciation of what constitutes a healthy and sustainable ecosystem of research and innovation—one that reflects the interdependent yet distinct natures of the academy and the corporate world. Crucially, industry needs to look past the myth that research is something it cannot afford, or something only extremely large companies can afford. I funded my research program at Alias Research, a 500-person 3D-graphics-software company, on about 1% of revenue (revenue was roughly $100 million, so on the order of $1 million a year). For larger companies, as little as 0.5% of revenue is sufficient to support a research group worthy of the name.

What we are doing is not working. The current worldwide economic crisis and the repeated cries for innovation and game-changing ideas are all-too-visible indicators of that. We are largely paying the price for policy decisions made a quarter of a century ago. Those policies are still rigidly in place. But it is not too late for change. Industry should pick up the ball or suffer the consequences, and academics should get back to long-term work.

The real question is not "Can I afford to invest in research?" It is, "How can I afford not to?"

