By Otis Port
It's time to say goodbye to Gordon E. Moore, who will soon step down as chairman emeritus of Intel Corp. (INTC) at age 72. Moore isn't as famous as Bill Gates or Steve Jobs or even Andy Grove, who replaced him as chairman of Intel. But he pioneered the microprocessor, managed the rise of Intel, and helped create the culture of Silicon Valley. Along the way, he gave us Moore's Law, the seminal insight that microchip power would double every 18 months while prices plummeted.
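That 18-month doubling compounds faster than intuition suggests. As a minimal illustrative sketch (the function name and starting figures are mine, not from the article), doubling every 18 months works out to roughly a 100-fold increase per decade:

```python
# Illustrative only: the compounding implied by the popular form of
# Moore's Law (transistor counts doubling every 18 months).
def moores_law_growth(years, doubling_period_years=1.5):
    """Multiplicative growth factor after `years` of 18-month doublings."""
    return 2 ** (years / doubling_period_years)

# Over one decade: 2 ** (10 / 1.5) is about 101.6 -- roughly a
# hundredfold increase in transistor count every ten years.
decade_factor = moores_law_growth(10)
print(f"Growth over 10 years: about {decade_factor:.0f}x")
```

The same arithmetic explains why the prediction became self-reinforcing: a chipmaker that missed even one doubling cycle would fall a factor of two behind its rivals.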
Indeed, it has been a remarkable ride--one that began in 1956, when Moore, not long after earning a PhD in chemistry, signed on at Silicon Valley's first semiconductor company, Shockley Semiconductor Laboratory. It had been founded in Palo Alto, Calif., the year before by William B. Shockley, leader of the Bell Labs team that had invented the transistor in 1947. Moore quickly realized that the company's prospects were dim, and in 1957, he marched out with Robert N. Noyce and six others--dubbed the Traitorous Eight by Shockley.
They started Fairchild Semiconductor Corp. with backing from Fairchild Camera & Instrument Corp. in Syosset, N.Y. Two years later, Noyce built the first silicon chip with more than one transistor; today's chips have millions. To make these marvels in volume, Noyce and Moore developed the equipment and processes now common in the semiconductor industry. But when they couldn't persuade Fairchild Semi's old-line owners to make stock options available to engineers and other employees, Noyce and Moore--the latter then 39--jumped ship once again, founding Intel in 1968. Their raw entrepreneurialism--and willingness to risk all on a new company when unhappy with the constraints of the old--set a pattern that would become a Silicon Valley hallmark.
From the start, the pair created a culture that turned Intel into a hotbed of innovation. In 1970, it invented dynamic random-access memories (DRAMs), the main data-storage chips in computers, and a year later, the microprocessor--originally designed as the guts of a Japanese calculator. Intel's creativity stemmed largely from the belief, carefully nurtured by Moore, that mistakes should be tolerated--and even rewarded. People who don't make mistakes, Moore often asserted, aren't taking big enough risks--and those who do stumble become more valuable for the lessons they learn.
Not all was smooth sailing. After Moore took over as CEO of Intel in 1975, the company launched, of all things, a digital wristwatch. It was a disaster and was pulled from the market after just three years--and $15 million in losses. It was money the young company could ill afford to lose. The huge market for microprocessors that personal computers would create was still years away. Meantime, Japanese chipmakers were starting to attack Intel's bread-and-butter DRAM business. From then on, Moore would wear a Microma--"my $15 million watch," he often quipped--to remind himself of the blunder.
By 1981--the year IBM introduced its first PC, which used an Intel microprocessor--Intel was losing so much money on DRAMs that its future was shaky. Moore kept Intel in the game, in part by selling IBM a piece of Intel, providing the capital to develop the processors. In 1985, Moore bowed out of the DRAM business, betting the company's future on microprocessors. That proved an inspired move. Intel today is the world's biggest and richest chipmaking enterprise--and its success has made Gordon Moore the fifth-wealthiest American, worth an estimated $26 billion.
Moore, who turned the CEO's reins over to Andy Grove in 1987, hasn't always been wild about being better known for Moore's Law than for laying the foundations of Intel's success. In the end, though, he's proud that Moore's Law eventually became a self-fulfilling prophecy: It has driven the industry to seek out the technologies needed to make his 1965 prediction come true. He knows Moore's Law can't hold for four more decades; transistors will get as small as is physically possible before then. But it could remain the guiding light for perhaps another 20 years. Gordon Moore may be stepping down, but his name will continue to be invoked with each new leap in the digital revolution.

Senior Writer Port has covered the microchip industry since 1977.