Given more processing power, big advances in PCs are possible. But Microsoft's Mundie says breakthrough research and training are urgent
In the not-too-distant future, your PC will do a lot of things it probably can't do now—say, guess which file you'll use next or organize your notes for an upcoming meeting. That's if programmers can harness all the processing power that could make those scenarios come to life. And at the moment, that's not so certain, a top Microsoft (MSFT) executive says.
"The programmers of the world have never let lie fallow a computing resource of any capability," said Craig Mundie, Microsoft's chief research and strategy officer, on Apr. 30 during a dinner in San Francisco with reporters. Yet that's just what could happen unless computer and software makers, high-tech startups, and universities do more to train developers capable of programming a coming wave of ultrapowerful chips. "Without that, it will all go to waste."
No tools yet
In a speech at Microsoft's Silicon Valley campus in Mountain View, Calif., earlier that day, Mundie, who manages the company's research labs and public policy stance, told a conference of academics and venture capitalists they need to do more to educate programmers and to fund startups that can exploit the biggest change in computer programming in more than 20 years.
As chipmakers pack more and more processing "cores" onto their products—perhaps hundreds within a decade—PCs could expand their power by 50 or 100 times (see BusinessWeek.com, 2/12/07, "Intel Builds the Fastest Chip Ever"). The technology could re-energize the computing market by making desktops, cell phones, and other silicon-powered devices like TVs act more like "personal assistants," able to predict users' preferences and actions, Mundie said.
What's needed, though, is breakthrough research and more training in the programming that's necessary to tap that horsepower. It's called parallel programming and it involves instructing a system to carry out many tasks at the same time. "We haven't created either the tools or the trained people," he said. "This is really a profound issue now for the industry."
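Mundie's "many tasks at the same time" can be illustrated with a minimal sketch (this example is not from Microsoft's tools; the task and numbers are made up for illustration). The same work is done serially and then spread across processor cores, producing identical results:

```python
from concurrent.futures import ProcessPoolExecutor


def work(n):
    # Stand-in for a CPU-heavy task: sum of squares below n.
    return sum(i * i for i in range(n))


if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]

    # Serial: one core handles every task in turn.
    serial = [work(n) for n in inputs]

    # Parallel: each task may run on its own core.
    with ProcessPoolExecutor() as pool:
        parallel = list(pool.map(work, inputs))

    assert serial == parallel  # same answers, potentially far less wall time
```

The hard part Mundie points to is not the mechanics shown here but writing real applications whose tasks can be safely split this way at all.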
That's especially true as computer engineers try to build systems that weave PCs, cell phones, and other devices together in ways that let users share data like calendar entries, documents, music, and video across a personal network of information. In June 2006, Mundie and Microsoft Chief Software Architect Ray Ozzie took on many of Chairman Bill Gates' technical duties. They'll assume more as Gates prepares to leave daily work at the company in June, 2008.
For Microsoft, one big challenge in coming years will be drumming up demand for products as PC sales in the U.S. and other Western markets rise at single-digit percentage rates. PCs are on track to keep adding processing power as Intel (INTC), Advanced Micro Devices (AMD), and IBM (IBM) add more cores to chips. "Once you have so many cores available, the things you'll be able to do with a computer aren't the same things you're able to do today," says Intel technology strategist Manny Vara. Intel's research labs have built prototype software that, running on multicore chips more powerful than today's products, can automatically edit digital video and pull the highlights from a recorded soccer game.
Mundie's concern is that before long, the tech industry will suffer a dearth of applications that can take advantage of the new hardware in ways that resonate with consumers. "If I told you that I had 100 times more power, and I applied it to Word, Excel, and PowerPoint, would you care?" said Mundie. "The answer is no."
Martin Griss, associate dean for education at Carnegie Mellon West, who attended Mundie's talk, says squeezing more processing cores onto chips could open the door for a new breed of programs that can predict users' behavior based on what they've done in the past, the time of day or week, and clues from their calendars.
The results could be PCs that know not to interrupt you because you're working on a deadline, software that can obey rapidly spoken commands, and videoconferencing systems that can follow speakers around a room, he says. Virtual worlds like Linden Lab's Second Life could also benefit from multicore chips so their graphics resemble sophisticated video games (see BusinessWeek.com, 4/16/07, "The Coming Virtual Web"). "It's clear there's enough power now to do simple things," says Griss. "As more can be done in parallel, you can analyze more options. Like playing [computer] chess, how deep into this tree of decisions can you go before it's time to act?"
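Griss's chess analogy can be made concrete with a back-of-the-envelope sketch (the branching factor below is a rough figure often quoted for chess, not a claim from the article). The number of positions a search must examine grows exponentially with depth, so each extra level of look-ahead demands a large multiple of computing power:

```python
def nodes_in_tree(branching, depth):
    # Total positions examined in a full search down to `depth`,
    # counting the root: 1 + b + b^2 + ... + b^depth.
    return sum(branching ** d for d in range(depth + 1))


# With ~30 legal moves per chess position, one extra ply of
# look-ahead multiplies the work by roughly the branching factor.
shallow = nodes_in_tree(30, 4)
deep = nodes_in_tree(30, 5)
ratio = deep / shallow  # close to 30
```

This is why Mundie's promised 50- to 100-fold increase in processing power buys only a level or two of extra "depth" in decision-making, and why using it well matters so much.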
Microsoft's new operating system, Windows Vista, includes some of this speculative ability. The software's SuperFetch feature builds a statistical model of users' behavior, the time of day, and other factors that predicts what program they're most likely to launch next, leading to faster load times.
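The flavor of such prediction can be sketched with a toy frequency model (this is an illustrative assumption, not SuperFetch's actual algorithm, which Microsoft has not detailed here): tally which program is launched at which hour of the day, then guess the most frequent one.

```python
from collections import Counter, defaultdict


class LaunchPredictor:
    """Toy sketch of usage-based prediction: counts app launches
    per hour-of-day bucket and predicts the most frequent one.
    Not Microsoft's SuperFetch algorithm, just an illustration."""

    def __init__(self):
        self.counts = defaultdict(Counter)

    def record(self, hour, app):
        # Remember that `app` was launched during this hour.
        self.counts[hour][app] += 1

    def predict(self, hour):
        # Most frequently launched app for that hour, if any history exists.
        launches = self.counts[hour]
        return launches.most_common(1)[0][0] if launches else None


predictor = LaunchPredictor()
for hour, app in [(9, "email"), (9, "email"), (9, "browser"), (13, "editor")]:
    predictor.record(hour, app)
```

A real prefetcher would weigh many more signals, as the article notes, and would use its guess to preload the program into memory before the user clicks.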
But more can be done, including writing software that recognizes handwriting with greater accuracy or that automatically opens e-mails and documents related to a meeting that happens at the same time each week. "Like a great personal assistant, the computer should move into this space I call 'speculative execution,'" says Mundie. Microsoft is working on programming tools that will let developers exploit parallel programming techniques, previously the domain of supercomputer users.
Venture capitalists are just starting to mine the area. Matt Murphy, a partner at Kleiner Perkins Caufield & Byers, led the firm's September, 2006, investment in PeakStream, a Silicon Valley company that sells programming tools that can let developers harness the power of graphics chips to speed up computing in industries including investment banking, oil and gas exploration, and pharmaceuticals. PeakStream has raised $17 million from Kleiner, Sequoia Capital, and Foundation Capital.
If Mundie has his way, that's only the beginning.