George Heilmeier, chairman emeritus of Telcordia Technologies, is a walking encyclopedia of U.S. innovation. Now 69 years old, he has played a major role not only in crucial information-age technologies -- such as silicon chips and the Internet -- but also in such heady military stuff as stealth warplanes.
He kick-started his career by developing the first practical liquid-crystal display (LCD). His flat screens paved the way for laptop computers and wall-hanging TVs. He practically breathed LCDs in the 1960s while doing research at RCA Laboratories, a TV pioneer.
Then he earned a civilian's brass hat by managing research in the Pentagon and later heading the Defense Advanced Research Projects Agency. Among DARPA's achievements during his mid-1970s tenure was a method of precisely tracking submerged Russian nuclear submarines.
Moving on to Texas Instruments in 1978, Heilmeier helped lay the foundation for TI's future with the 1982 invention of the digital-signal processor (DSP). TI is still the world's top producer of DSP chips, which are the guts of digital cameras, modems, cordless phones, and cell phones.
With that experience in communications technologies under his belt, Heilmeier in 1991 took the CEO's job at Bellcore Corp., then the research arm of the Baby Bells. During the 1990s, as the Internet was picking up steam, Bellcore played a prominent role in the evolution of the Net and the overall telecom business. SAIC, a major R&D outfit, acquired Bellcore in 1997 and renamed it Telcordia.
Just before Heilmeier flew to Japan in November to be honored for his LCD innovations with a Kyoto Prize -- Japan's counterpart of a Nobel Prize -- he talked about his career with BusinessWeek Senior Writer Otis Port. Here are edited excerpts of their conversation:
Did you imagine right from the start that LCDs would become hang-on-the-wall TVs?
We always knew exactly what we wanted to do with them: build big flat-panel displays. In fact, flat-panel TV was the holy grail of the whole TV industry at that time. That's why RCA immediately classified my project in 1964.
Were there many experimental flops along the way?
Yes, of course. But that's often the way you learn. Successes usually don't teach you as much as failures. They just confirm that what you already knew was right.
I remember showing a prototype LCD to Vladimir Zworykin, who's often called the father of black-and-white TV. He was an honorary VP at RCA Labs. He asked me how we had come up with so many electro-optical breakthroughs. I told him that we stumbled across some of them. "Stumbled?" he said. "Perhaps -- but to stumble, you have to be moving."
What's especially memorable from all the work you led at DARPA?
Well, not the least was the first stealth aircraft, called Have Blue. It led to the F-117 Nighthawk. [DARPA started work on Have Blue in 1976, a year after Heilmeier took over at DARPA.] Have Blue showed us that we could design a plane with the radar cross-section of a sparrow. [On a radar screen, the plane would appear to be the size of a sparrow, so most radars would never spot it.]
Have Blue's first flight was on December 1, 1977. It was an early-morning flight, to make sure there were no Soviet satellites overhead. The plane used up a lot of runway before it lifted off. Later, with Have Blue back in the hangar, Kelly Johnson broke out a huge bottle of champagne that he had ordered flown over from France by an SR-71. [C.L. "Kelly" Johnson ran Lockheed's Skunk Works when it hatched the SR-71 Blackbird spy plane, still the fastest jet ever built. It flew at three times the speed of sound. Johnson had retired in 1975 but wouldn't have missed Have Blue's maiden flight.]
After a round of toasts, Kelly told me, "You ought to take that bottle as a kind of keepsake." I told him he had to sign it first. He did, and I did -- and it has been in my study ever since.
DARPA and the Defense Dept. have pulled back, somewhat, from long-term and high-risk research. Do you worry the U.S. is sacrificing its technological future?
I think that's a misleading generalization. What has changed is that there are no more easy subsidies for university research. It's like the situation when I joined DARPA. Some university researchers used to be on a DARPA dole, essentially. DARPA just kept handing money to the same groups, year after year.
I changed that. I told them I wanted to support only the best ideas, that it was unacceptable to expect funding without good ideas. They finally got the message. I think a similar reassessment is going on today.
But what about the October report from the National Academies ["Rising Above the Gathering Storm"] that calls for substantial federal increases in long-term research? It's just the latest in a long string of similar reports.
I'm not certain you can make a good case for substantial increases. One reason is that the actual execution in research areas has been greatly improved. When I first went to TI, we'd take a long period of time to create a new chip: design it, build some prototypes, test them, find errors, revise the design, and build new prototypes -- often again and again.
Today, nobody does that anymore. We do modeling and simulations, and things work the first time. This enables us not only to do things faster and better, but also to do things we could never have dreamed of doing before.
Look at Intel's Itanium or any of the microprocessors developed over the last decade. They're just far too complex for that old approach. The design would be out of date before you could market it.
That's applied research and development. What about basic research?
I don't think basic research is dead at all. It's just that there's a premium on good ideas, because costs increase in tandem with complexity, and all technologies are becoming more complex. But you can use the same tools -- modeling and simulation -- to evaluate proposed research at much less cost than before.
For U.S. competitiveness, that does cut two ways, I admit. The glass is half full because we can do a better job of sifting out the best ideas. But the glass is half empty because the same thing is going on in Europe and Asia, too. So there's more competition -- and better competitors. But that's ultimately good for the world of science as a whole.
How important is broadband to U.S. competitiveness?
Very important -- for many reasons. Broadband communications are essential for transporting the huge data files used in today's computer models and simulations -- and for research teams to collaborate across disciplines and organizations. We need to be careful not to fall too far behind the Koreas and Japans in providing broadband everywhere, not just between the big research institutions.
The trouble here in the U.S. is the hypemeisters who are calling for all-IP [Internet protocol] broadband networks that would carry not just data and computer graphics, but also voice traffic and TV. That raises a real quality-of-service issue, because what you would do to improve data transmission is just the opposite of what you'd do to improve peer-to-peer TV.
How do we get around that?
More bandwidth certainly helps. It eases part of the problem. But more study on the quality-of-service issue is still needed.
You were a big booster of applied artificial-intelligence research at DARPA. Is AI finally starting to measure up to your expectations?
Not starting to -- already has. AI has been a great success. Actually, there are lots of successes. But you may not know that because the successes aren't called AI. They're called data mining, design optimization, expert systems, dynamic factory scheduling, and lots of other names. But AI is the cornerstone.