Online Extra: Where No Computer Has Gone Before


Tetsuya Sato calls it "holistic simulation." Sato is the head guy at Japan's Earth Simulator Center, which operates the world's fastest general-purpose computer. To Eng Lim Goh, it's "planned serendipity." He's the chief technology officer at Silicon Graphics (SGI), which builds high-performance computers and visualization systems. And "ultrascale computing" is the term frequently repeated by Raymond L. Orbach, director of the Energy Dept.'s Office of Science, one of the U.S. government's main sponsors of academic research.

Basically, they're all talking about the same thing: using supercomputers to discover new scientific knowledge. The idea is to create simulations so richly detailed that researchers and engineers may begin to understand phenomena for which conventional science still can't offer explanations. One famous example is the dynamics of a manmade sun -- a fusion-energy generator -- which physicists have been struggling to understand for decades.

Another example is turbulence. The physical laws that govern the motion of liquids and gases were laid down a century ago. Yet turbulence regularly -- and unpredictably -- develops in fluids. Turbulent flows around cars and planes increase drag, which requires extra fuel to overcome. Turbulence also occurs inside automobile engines during the mixing of gasoline and air, which makes combustion less efficient.
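For reference, the century-old laws in question are the Navier-Stokes equations (the article doesn't name them, but they are the standard formulation). For an incompressible fluid, they read:

```latex
% Incompressible Navier-Stokes equations:
% momentum balance plus conservation of mass.
\rho \left( \frac{\partial \mathbf{u}}{\partial t}
    + (\mathbf{u} \cdot \nabla)\mathbf{u} \right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u} + \mathbf{f},
\qquad \nabla \cdot \mathbf{u} = 0
```

Here u is the flow velocity, p the pressure, rho the fluid's density, mu its viscosity, and f any external force. The equations themselves are compact; the trouble is that their solutions turn chaotic in fast-moving flows, which is exactly the turbulence the article describes.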

"TOOK THE PLUNGE." Researchers can try to minimize turbulence by tinkering with this or that, based on intuition and hunches. But Sato says Earth Simulator marks the dawning of powerful supers that can, for the first time, simulate both micro- and macro-scale physics at the same time -- and with enough fidelity to provide new insights that may lead to a scientific basis for dealing with turbulence and other chaotic conditions.

Holistic simulations, Sato predicts, will prove to be superior to traditional experiments in most fields. In April, he launched two small research programs, one focused on holistic software, the other on holistic hardware. Although the Tokyo government didn't cough up additional funds, Sato believes so strongly in the concept that "we took the plunge and scraped the technology team" for a few good people.

Simulations are also a top priority in the U.S. In 2001, Orbach's Office of Science launched the Scientific Discovery through Advanced Computing (SciDAC) program. One mission is to cultivate methods for creating precise models of real-world systems -- everything from the interactions of whole economies to the quantum dance of subatomic particles. Another is to foster more cross-discipline research through online "collaboratories."

SOFTWARE TWEAKS. A third target is squeezing more juice out of the software written by scientists and engineers. This is critical, says Orbach, because over the past 20 years, "half of the total speedup in simulations is due to improvements in algorithms." For example, when the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Lab ran one fusion-energy simulation, its computer delivered less than 10% of its theoretical peak speed.

After fine-tuning, the same software notched 68% of peak performance. Five programs have been tweaked so far, says Orbach, "and the smallest improvement we got was reducing run time from 28 days to 4."
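A quick back-of-envelope check (my arithmetic, not the article's, and it assumes run time scales inversely with the fraction of peak performance achieved) shows the two quoted figures tell a consistent story:

```python
# Sanity check on the NERSC tuning figures quoted above.
# Assumption (not from the article): run time is inversely
# proportional to the fraction of peak performance achieved.

before_fraction = 0.10   # "less than 10% of its theoretical peak speed"
after_fraction = 0.68    # "68% of peak performance"

implied_speedup = after_fraction / before_fraction
print(f"Implied speedup: {implied_speedup:.1f}x")            # ~6.8x

days_before, days_after = 28, 4                              # quoted run times
print(f"Reported speedup: {days_before / days_after:.1f}x")  # 7.0x
```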

In industry, simulations will be increasingly important, he predicts. For example, to General Electric's (GE) jet-engine business, even a 1/2% improvement in fuel efficiency would be worth a king's ransom. Achieving it might require 3 quintillion calculations. But on a computer that can notch 10-teraflops speeds continuously, the problem would be solved in 3.6 days -- at a cost of roughly $10,000 in computer time. Going the traditional route of multiple design iterations and physical prototypes, it might take 3.6 years and millions of dollars.
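Those figures follow from simple division. Here is a minimal sketch of the arithmetic, taking the article's 3 quintillion (3 x 10^18) calculations and a sustained 10 teraflops (10^13 operations per second) at face value:

```python
# Time-to-solution for the jet-engine design problem quoted above,
# using the article's own numbers.

operations = 3e18          # 3 quintillion calculations
sustained_flops = 10e12    # 10 teraflops, sustained

seconds = operations / sustained_flops
days = seconds / 86_400    # 86,400 seconds per day
print(f"About {days:.1f} days")   # ~3.5 days, close to the article's 3.6
```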

EXCLUSIVE ACCESS. Starting in October, the notion of discovering new science through simulations will get a series of acid tests. Three research teams will get exclusive access to NERSC's super for a week or even a whole month. "That has never happened before," notes Orbach. Normally, supercomputer capacity is carved up into shares. Two or more projects are always running concurrently, each using just a portion of the machine's processors.

Analyzing the outcomes could take months, so the results probably won't be known until next year. But interim reports should be posted on the SciDAC Web site at www.osti.gov/scidac.

By Otis Port in New York, with Hiroko Tashiro in Tokyo

