Technology

Supercomputing for the Masses


Advances in chip design are prompting companies to dream up new uses for a coming surge in computing power. But the markets and software needed to exploit it are lagging

During the last technology boom, Dan Reed, a longtime supercomputer researcher and tech policy expert, stayed put in a professorship at the University of Illinois and managed its National Center for Supercomputing Applications (NCSA), even as the fabled lab hatched the Mosaic browser that gave rise to Netscape and helped set off the explosive growth of the Internet. During the dot-com bust that followed, Reed hunkered down in another faculty job at the University of North Carolina.

Now the computer industry is poised for a second transformation, in which supercomputing technology is trickling down to corporate data centers and desktop PCs, supplying them with unprecedented power. This time, Reed isn't missing out. On Dec. 3, he became the latest high-profile hire in a stable of supercomputer scientists that Microsoft (MSFT) is assembling in Redmond, Wash., to study how technology that has been the province of top-flight universities, government research labs, and a few huge corporations can transform everyday computing. "We have an opportunity to rethink issues at a deep level," says Reed, 50. "I said if the surf was up again, I was going to grab my board."

Bye-Bye Joysticks

The waves look inviting for others, too. Inspired by advances in chip design that will likely keep the performance of today's already powerful computers arcing steadily upward for years, tech companies are devising new scenarios in entertainment, engineering, product design, and medicine to take advantage of the potential power. IBM (IBM), the developer of the world's most powerful supercomputer at Lawrence Livermore National Laboratory, also supplies processors for graphics-intensive video game consoles from Microsoft, Sony (SNE), and Nintendo (NTDOY). Sony's PlayStation 3 features an IBM chip called the Cell that's also used in huge supercomputers that can easily fill a basketball court.

Intel (INTC) and Advanced Micro Devices (AMD), the two biggest suppliers of computer chips, have managed to cram four so-called processing cores onto their products. That's the equivalent of strapping four PCs together and jamming them into the space of a large envelope—and not much thicker. But the chipmakers don't expect to stop there. They plan to have dozens of processors on silicon chips within a decade. Marshalling all that power could open doors to new ways of interacting with machines.
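
The idea behind marshalling those cores is simple in principle, even if doing it well is hard: split a job into independent chunks and run them side by side. The sketch below is a hypothetical illustration in Python, not anything Intel or AMD ships; it merely spreads a toy workload across four worker processes, one per core.

```python
# A minimal sketch (not from the article) of spreading work across the four
# cores of a quad-core chip, using Python's standard multiprocessing module.
# The workload and the numbers are purely illustrative.
from multiprocessing import Pool

def simulate_chunk(seed: int) -> float:
    """Stand-in for one slice of a compute-heavy job (e.g., a physics step)."""
    total = 0.0
    for i in range(1_000_000):
        total += ((seed + i) % 7) * 0.5
    return total

if __name__ == "__main__":
    # One worker process per core; the four chunks run in parallel.
    with Pool(processes=4) as pool:
        results = pool.map(simulate_chunk, range(4))
    print(sum(results))
```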

Intel has talked to console video game makers about using chips that can perform in excess of 1 trillion calculations per second (BusinessWeek.com, 2/12/07) in future products that use cameras to track body motion to control the action, instead of using buttons or joysticks. "We imagine some future generation of [Nintendo's] Wii won't have hand controllers," says Justin Rattner, Intel's chief technology officer. "You just set up the cameras around the room and wave your hand like you're playing tennis." Intel missed out on supplying chips for the current generation of game systems, and is trying to gain a foothold there.

A Dearth of New Markets

But what may dash the dreams of Intel and other hardware makers is a lack of inexpensive, off-the-shelf software to bring supercomputing to the masses. For now, these sophisticated machines require equally sophisticated and, in many cases, custom-developed programs tended by highly paid engineers. That's why Microsoft is building a brain trust and underwriting grants to universities that study how supercomputer-style programming can be applied to personal machines. The company is also building data centers to serve up its new Live online software, facilities on a scale that a few years ago would have been more at home in a research lab.

Meanwhile, Hewlett-Packard (HPQ) on Nov. 13 delivered a new class of mini-supercomputer, designed for small engineering and biotech companies, that costs around $50,000. Callaway Golf (ELY) is an early HP customer. And IBM, Google (GOOG), and Yahoo! (YHOO) have launched initiatives in "cloud computing" (BusinessWeek.com, 11/16/07), harnessing supercomputing power for new Web-delivered software for applications like modeling risk in financial portfolios, generating computer graphics, or understanding conversational typed or spoken queries.

But creating new markets predicated on making supercomputer performance mainstream isn't a slam dunk. There's no clear path to the kind of high-volume markets, measured in the hundreds of millions of units sold, that the tech industry counts on to fund its advances. And efforts to apply high-performance computing power to a broader set of problems could be stalled by a dearth of widespread programming know-how (BusinessWeek.com, 5/2/07).

"Is this whole infatuation with performance something that has moved beyond what the vast majority of users really care about?" asks Intel's Rattner. "Are there really a set of applications that require 10, 100, 1,000 times the performance we have today? And if we have it at an attractive price point, will it drive high volumes? It's still to be determined," he says. "There are still people who question whether the volume markets are there for all this performance."

Intel is pondering these questions as it prepares to celebrate the 60th anniversary of the transistor, invented at Bell Labs in 1947 and the basic building block of its products. Intel's most recent quad-core chips contain about 820 million transistors, tiny devices that amplify electrical signals and switch the flow of current.

Programming Advances Required to End Customization

High-performance computing has been a growth market for tech companies, even as demand for more traditional business systems has ebbed. The market for high-performance servers alone reached $10 billion in 2006, and grew 18% in the third quarter of 2007, to $3 billion, according to market researcher IDC (IDC). "The real growth in revenue has been at the bottom half of the market," as segmented by price, says Ed Turkel, a manager in HP's high-performance computing division. Distributing work across dozens of processors to speed performance "will ripple down to PC desktops as well," he says. "We're at the front of that technology curve."

Combined with advances in graphics processing and the spread of high-speed Internet connections, researchers say, that power could enable more immersive online worlds with realistic graphics, as well as "personalized information spaces": computers that can track conversations, anticipate participants' needs, and pull up any documents relevant to the discussion. Researchers are also exploring systems that could compile magnetic-resonance-imaging data on the spot in a doctor's office, so patients don't have to wait days for results, or let architects see instantly how changes to a concert hall's design would affect its acoustics. "This is a real sea change that's happening in computing," says Thom Dunning, Reed's replacement as director of the NCSA, which lends its supercomputers to companies and universities for research projects.

But if the computer industry hopes to apply supercomputing techniques to products designed for millions of users, it's going to have to match the performance gains in hardware with significant advances in software and programming techniques. Most software companies—and their customers—can't afford the expense of custom designs, which is the way most supercomputers are used today. "For the mass market, you can't count on customization," says the NCSA's Dunning. "It's far too expensive. We're keeping our fingers crossed that Microsoft, Intel, AMD, and other companies will provide some incentive."
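
For a rough sense of what such an incentive might buy, consider the gap between hand-tuned supercomputer codes and the kind of high-level tools a mainstream developer could pick up off the shelf. The sketch below is a hypothetical example using Python's standard concurrent.futures library, not any product named in the article; the point is that the library, rather than the programmer, manages the underlying cores.

```python
# A hypothetical illustration of "off-the-shelf" parallelism: Python's
# standard concurrent.futures hides the process management that custom
# supercomputer codes handle by hand. The task name and workload are invented.
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame_number: int) -> str:
    """Stand-in for an expensive task, such as rendering one video frame."""
    return f"frame-{frame_number}-done"

if __name__ == "__main__":
    # The executor decides how to spread the frames across available cores;
    # the application code never touches threads, locks, or message passing.
    with ProcessPoolExecutor() as executor:
        for result in executor.map(render_frame, range(8)):
            print(result)
```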

