Is U.S. dominance in science and technology starting to wane? It's actually not a new question. So let's begin by flashing back to the early 1990s. Remember then how mighty America seemed to be stumbling? Remember all the hand-wringing about technologies like the VCR -- invented in the U.S. but commercialized by Japan? To the gloom-and-doomers, the handwriting was on the wall. U.S. inventors would make brilliant breakthroughs, but other countries would be the first to market. U.S. consumers would then be forced to buy these home-grown technologies from Japanese and European competitors.
And the VCR was just the beginning. Worried U.S. politicians and agency officials drew up lists of critical technologies that foreign competitors were going to control. Flat-panel displays. Semiconductors. Supercomputers. The list was long and depressing.
Japan was thought to be particularly accomplished. After all, it was much more focused than the U.S. was. It had a central authority, the powerful MITI (Ministry of International Trade & Industry), which targeted key technologies, nurtured their development, and helped Japanese outfits win the race to market. Japan's industry was patient. It was capable of looking ahead 5, 10, or 20 years, and it could invest in the long-term R&D needed to bring new ideas to fruition.
HYSTERICAL IN HINDSIGHT. By contrast, the U.S. system of innovation was disorganized, chaotic, and impatient. Yes, the government spent billions on research. But outside the military, much of this money went to basic research. It was up to industry to do the hard work of turning ideas into products. And American businesses weren't up to the task of making VCRs or other key new products. Indeed, no one seemed capable of doing R&D for the long haul.
That's why there were calls to set up a MITI-like system in the U.S. The semiconductor industry begged Congress for a crash multibillion-dollar program (from taxpayers) to create new generations of chips and chipmaking equipment. The Defense Advanced Research Projects Agency poured money into U.S. flat-panel makers in a doomed effort to keep them in the race. And the first Bush Administration and Congress set up a program, dubbed the Advanced Technology Program, that was supposed to fill the huge perceived gap between invention and commercialization.
How hysterical it all seems now. And how lucky Americans are that the government didn't heed all those calls. Chances are that the money would have been wasted, in an echo of the synthetic-fuels debacle of the 1970s. MITI turned out to be not just fallible but also an albatross, sending Japanese industry down many technological blind alleys.
History's powerful lesson: When it comes to a nation as a whole, the best system to foster innovation is disorganized and chaotic. As in the U.S., no central authority in government or industry should decide what the future ought to hold or choose R&D winners and losers. A better way is simply to have a climate in which ideas are allowed to flourish -- and in which the market is free to pick and choose. The U.S. has plenty of bold entrepreneurs who are willing to take flyers on crazy-sounding ideas -- and the capital to back them. All of which raises the following questions:
So, overall, the U.S. is still strong?
Unquestionably. America is the world's R&D powerhouse. Overall spending jumped dramatically in the last few years of the 20th century, rising from $169.2 billion to $265 billion, the largest increase for any six-year period in the nation's history. While much of that growth came from industry, total U.S. spending still dwarfs that of any other nation. The EU spends about 1.9% of GDP on R&D, compared with America's 2.7%.
And the U.S. benefits from its decentralized R&D system. With some exceptions, the main mission of agencies like the National Science Foundation or the National Institutes of Health is to nurture ideas that bubble up. Yet plenty of opportunities also exist to develop those ideas in both industry and government. The net effect: The U.S. has the world's most productive system for planting countless seeds -- and fertilizing the few that blossom.
That sounds like there's nothing to worry about.
Well, not quite. Some major trends are cause for concern. The first is that the rapid rise in R&D spending of the late 1990s has slowed significantly -- even declined in some areas.
Government spending has been largely flat for the last four years, at about $88 billion. Some areas have received massive boosts. The NIH budget doubled in the last five years, soaring to some $27 billion. So it's no surprise that the U.S. is the world's leader in biotechnology.
Funding for other hot areas, such as nanotechnology and bioterrorism research, is up dramatically. But that means big drops in other areas. There's widespread concern that basic work in areas like solid-state physics is getting short shrift.
Industry R&D investments have slowed as well. The overall figure is an impressive $181 billion annually -- but that represents a decline over the last four years. And the real drop in U.S. R&D is larger than it appears. Just as companies have outsourced many other jobs, they've shifted R&D offshore. Thousands of scientists toil at 700 research centers throughout Russia and the former Soviet republics, pushing outward the frontiers of knowledge using dollars from U.S. agencies and companies.
Scores of U.S. outfits have set up labs in India or elsewhere in Asia. And at the same time, they've eliminated or downsized many of the big U.S. labs -- such as Bell Labs and Xerox PARC -- that once were powerful engines of innovation.
These changes come at a time when other nations are boosting their own science and technology capabilities. In Korea, scientists have become leaders in controversial areas like cloning, while Britain is forging ahead of the U.S. in stem cells. China is making a bold bid to become world class in everything from biology to semiconductor R&D. Indeed, it plans to produce 350,000 newly minted engineers annually, compared with the 100,000 who now graduate each year in the U.S.
The upshot: More and more ideas will be generated in labs outside of the U.S.
So won't that threaten U.S. technological leadership?
It could. But the key question isn't where the ideas or breakthroughs come from. It's which countries will benefit the most from them. "Intellectual dilution is inevitable," explains Greg E. Blonder, a former Bell Laboratories scientist who's now a venture capitalist. "Whether we want to outsource R&D or not, the reality is most of it is going to occur outside the U.S."
In the future, he adds, "the U.S. can only count on making at most one in five inventions." After all, he points out, "there's only a quarter of a billion people in the U.S., vs. 6.5 billion or 7 billion elsewhere. And the people in China and India are disproportionately interested in doing engineering and science -- much more so than we are in the U.S. So you've got to figure the odds are against us coming up with all the good ideas over time."
That's not necessarily a threat, however. Remember that some of the overseas research is being funded by U.S. industry. The fruits of that work should flow back to the U.S. -- and at a lower cost than doing it here. America benefits from the resulting boost in productivity. And the R&D spending helps raise local standards of living, creating more consumers for U.S. goods.
It's also hard to argue that more research -- wherever it takes place -- is bad. Like everything else in the global economy, ideas and innovations are fungible. The real key is creating the right economic and business climate for spotting and developing ideas (wherever they spring from). As long as America is a land where dreams can be chased and realized and where failure is allowed, it will stay atop the technological heap -- and be a mecca for many of the brightest minds around the world.
Is that likely to remain true?
Perhaps not as much as Americans would like. Clearly, as R&D opportunities increase elsewhere, ambitious young researchers have less need to emigrate to the U.S. And American policies aren't helping either. Increasing hassles with visas for graduate students and young researchers are slowing the flow of bright young minds. "Applications from international students for advanced science and engineering degree programs are down by a third compared to last year," warns Donald F. Boesch, president of the University of Maryland. "If this continues for a long time, it will mean trouble for U.S. competitiveness."
So what should we conclude?
When it comes to R&D, the U.S. system is fundamentally sound. Yes, it would be better if spending began to increase again. But right now, no evidence exists of any serious problem.
Yet at the same time, sweeping trends may forever alter the landscape. More and more, ideas and inventions and even new technologies will spring up on foreign soil. It's possible that overseas competitors will take advantage of those inventions and push the U.S. into a technological decline. It's also possible that U.S. businesses will be quick to pounce on the new ideas -- and that the powerful American economy will provide the most fertile environment for the development and commercialization of inventions.
The lesson of the past is that it probably makes a lot of sense to worry. But it's also worth remembering that over the past several decades, predictions of the demise of America's tech prowess have been greatly exaggerated. By John Carey in Washington, with Otis Port and Adam Aston in New York