Attack of the Killer Dust


It's a nightmare scenario: A group of brash Silicon Valley scientist-entrepreneurs, racing against a deadline and desperate to strike it rich, field-test a swarm of micro-robots the size of dust particles in the lonely desert around their Nevada factory. Designed under a military grant, the 'bots are supposed to be the latest in battlefield surveillance. But they quickly morph into predatory, self-replicating swarms.

Actually, the swarms issue from the keyboard of author Michael Crichton, creator of Jurassic Park and other tales of science-run-amok. Slated for release on Nov. 25, the latest novel, Prey, taps into society's fear of scientists too entranced by their creations to fret about the consequences. And, as in his previous works, Crichton has made a shrewd choice in Prey: He mines the budding field of nanotechnology--the science of the extremely small. "We're in a social environment that's tremendously disposed to new things," Crichton said in an interview. "But there always is a downside."

The timing of Crichton's novel is both good and bad. It arrives at a juncture when prominent scientists are debating the potential hazards of nanotech research. Yet the plot pales beside the real specter of biowarfare in the post-September 11 world. Compared with the known horrors of a smallpox epidemic, Crichton's sentient, micromechanical swarms feel like the far future--which means real scientists may find the author's treatment ludicrous. But timing aside, the book could serve an important function, prompting the public to contemplate the potential hazards of this new field. "These technologies have risks," acknowledges Neil Gershenfeld, director of the Center for Bits & Atoms at Massachusetts Institute of Technology. "There is great value in taking stock of them."

How implausible is Crichton's latest scare story? Briefed on the plot by BusinessWeek several weeks before Prey went on sale, scientists on nanotech's frontier were mostly dubious. But serious researchers have been kicking around scenarios that resemble Crichton's for years. The latest round of hand-wringing commenced when Sun Microsystems Inc. (SUNW) co-founder Bill Joy published an essay on the topic in Wired magazine in April, 2000. In his piece, the renowned software engineer warned that the convergence of nanotech, genetic engineering, and robotics could trigger unprecedented disasters.

Some scientists ridiculed Joy's thesis at the time--but Joy has not backed down. If anything, his views have hardened into a stance that radically challenges how science is now practiced. "We have to limit access to the kinds of scientific information that lets people do these kinds of things in laboratories," Joy declared in a TV interview in October. "That will require regulation of materials and information in ways that we haven't had to do before" (page 104).

Threatening or not, nanotech is fast emerging as a fruitful confluence of science and engineering. Once considered blue-sky research, the field has produced such milestones as carbon nanotubes a few hundred millionths of an inch thick that function as the world's tiniest transistors. The U.S. government is pumping $604 million into nanotech research this year--up 30% from 2001--and will raise the ante by 18% in 2003. Biologists, chemists, software engineers, and venture capitalists are flocking to nanotech seminars. Over the past year, venture funds have committed about $437 million to 46 nanotech deals, according to Venture Reporter.

Even this outpouring of funds, though, won't take the technology to the level Crichton fancifully depicts. Where fiction and science part ways is on the topic of self-replication. The scientists in Prey use cell cultures to breed their nanobots. But Crichton also throws into the mix "assemblers"--robots no larger than a handful of atoms that construct themselves and other objects atom by atom. The term, popularized in the 1980s, is a red flag to skeptics because the concept is flawed, say Harvard University chemistry professor George M. Whitesides and chemistry Nobel Prize laureate Richard Smalley of Rice University. Even assuming such 'bots could be created, they would never be able to place atoms precisely where they were required, argue both men.

In the opposing camp, computer scientist and inventor Ray Kurzweil and others say the doubters haven't adequately accounted for the role powerful computers will play in solving such problems. Even fringe nanotech concepts could benefit from trends such as Moore's Law, which holds that the computing power of microchips doubles every 18 months. "People say we're not going to see self-replicating nanotechnology entities for 100 years," Kurzweil notes. "But we're doubling the paradigm shift for every decade, so we'll get there in 25 calendar years."
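Kurzweil's arithmetic rests on a model of accelerating returns. A minimal sketch of that model (an illustration, not a calculation from the article): assume the rate of technological progress itself doubles every decade, then count how many calendar years it takes to accumulate the equivalent of 100 years of progress at today's rate. Under this simple discrete model the answer comes out near 30 years, in the same ballpark as Kurzweil's figure of 25; his exact number presumably reflects different modeling assumptions.

```python
# Illustrative "accelerating returns" arithmetic (an assumed model,
# not taken from the article): progress in calendar year t is
# 2 ** (t / 10) units, where one unit equals one year of progress
# at today's rate -- i.e., the pace doubles every decade.

def years_to_accumulate(target_progress=100.0):
    """Return the number of calendar years until cumulative
    progress first reaches target_progress units."""
    cumulative = 0.0
    year = 0
    while cumulative < target_progress:
        cumulative += 2 ** (year / 10)  # this year's progress
        year += 1
    return year

print(years_to_accumulate())  # -> 31 under this model
```

At a constant pace the same 100 units would take 100 years; the doubling compresses that to about three decades, which is the shape of Kurzweil's argument even if the exact horizon depends on the assumed doubling period.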

Of all the diverse nanotech projects under way, the ones that most closely evoke Crichton's drama are lab experiments on "smart dust." Kristofer Pister, a professor and director of the University of California at Berkeley's Sensor & Actuator Center, is working on such devices, which are now about the size of sand grains and contain sensors. His ultimate goal is to create dust-size particles fitted with silicon intelligence, radios, sensors, and possibly wings. Such grains could be scattered through buildings--or over a battlefield--to monitor the environment and human activity.

Watching such developments, Crichton takes a sanguine view. "Even though I write these Crichton stories, I'm quite optimistic," he says. Kurzweil also remains undaunted by the downside. "If we do half as well as we have done with software viruses, we will do well" in nanotech, too, he predicts.

Even amid all the contention surrounding this field, scientists seem to agree that Crichton's scare scenarios play a useful role. Under the public's watchful eye, researchers are more likely to proceed with caution--and continue to develop technologies that will help protect us from ourselves.

By Heather Green in New York

