

Cover Story

VIRTUAL REALITY

Psychologists call it "suspending disbelief." Computer jocks call it entering "virtual reality." Whatever the jargon, it doesn't begin to describe what happens in Arlington, Va., at the Institute for Defense Analyses.

You sit in a wood-paneled room as Colonel Jack Thorpe, special assistant for simulation at the Pentagon's Defense Advanced Research Projects Agency, douses the lights, flips on a computer -- and sends three five-foot screens in front of you thundering into action. Instantly, you're transported inside a tank rolling across the Iraqi desert. You are performing the same maneuvers as a unit of the 2nd Armored Cavalry during "73 Easting," an actual battle in the Persian Gulf war. The graphics on the screens are only video-game quality. Yet, the illusion works: You duck as shells scream toward you and explode in ear-splitting fury.

It isn't unusual for soldiers participating in this exercise to curse or sweat as the computer-simulated fight unfolds. Something else happens as well: Their scores for battlefield acumen improve dramatically after they practice with these video tank crews. In an era of shrinking defense budgets, such training offers invaluable experience without the cost, damage, and logistical hassle of war games. "We will expect a smaller military to be masters of a wider ensemble of skills," says Thorpe. "This is an idea whose time is right."

NEW SENSATIONS. The cyberspace tank battle is primitive compared with visions of "virtual reality" trumpeted in books, movies, and the TV show Star Trek: The Next Generation. There, intergalactic travelers use computers to conjure up Sherlock Holmes's London or a sexy date. But as DARPA's system proves, computer-generated worlds don't have to be super-realistic to evoke real life. That fact is turning virtual reality into a red-hot technology.

There's plenty of confusion over what VR is. But to most developers, the core of every system is a data base that contains data from a brain scan, specifications for a car dashboard, the description of a fictional landscape -- in short, data that can represent almost anything. A powerful computer with sophisticated graphics then renders a "world," often in 3-D, that recreates precisely what the data describe. VR displays vary widely, from images on a computer monitor to theater-style displays such as 73 Easting to projections on stereoscopic lenses mounted inside helmets that VR participants wear.
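
In programming terms, that architecture is simple to picture: the world is data, and the display is whatever a render loop draws from the viewer's current vantage point. The short Python sketch below is illustrative only; its objects, fields, and toy projection are assumptions, not the internals of any system described here.

```python
# A minimal sketch of the architecture described above: a world is just data,
# and a render loop redraws that data from the viewer's current point of view.
# The object fields and the simple projection are illustrative assumptions.
import math

world = [  # the "data base": anything describable as data can populate it
    {"name": "dashboard knob", "x": 0.3, "y": -0.2, "z": 2.0},
    {"name": "steering wheel", "x": 0.0, "y": 0.0,  "z": 1.5},
]

def render(world, viewer_yaw: float):
    """Project each object into screen coordinates for the current head angle."""
    frame = []
    for obj in world:
        # rotate the scene around the viewer, then do a simple perspective divide
        x = obj["x"] * math.cos(viewer_yaw) - obj["z"] * math.sin(viewer_yaw)
        z = obj["x"] * math.sin(viewer_yaw) + obj["z"] * math.cos(viewer_yaw)
        if z > 0.1:  # only draw what lies in front of the viewer
            frame.append((obj["name"], x / z, obj["y"] / z))
    return frame

print(render(world, viewer_yaw=0.0))   # looking straight ahead
print(render(world, viewer_yaw=0.5))   # the head tracker reports a turn
```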

Whatever the approach, two characteristics distinguish VR worlds from other computer graphics: Increasingly, they convey multiple kinds of sensory information -- sound or touch -- to make environments more realistic. And they are interactive. In some systems, a viewer wearing a sensor-laden glove manipulates objects in the computer as one would naturally. In others, images on the screen or a viewer's perspective are manipulated with a mouse or joystick.

At IBM's Watson Labs in Hawthorne, N.Y., for instance, an engineer seated in front of a projection screen, looking at a sleek, beige dashboard, becomes a test driver for a 1997 Chrysler. Wearing 3-D glasses and a glove with sensors, he turns the steering wheel and reaches for buttons as though in a real car. Chrysler Corp. is developing the system with IBM in hopes that the exercise could cut months off the three-year to five-year car-design process by letting engineers spot inconveniently positioned knobs and other problems before they surface in expensive prototypes.

`PAST THE HYPE.' Intrigued by this kind of potential, dozens of government, university, and industrial labs, from NASA and the Defense Dept. to the University of Washington (UW), are embracing virtual reality. In the next four years, the military hopes to spend more than $500 million on simulations. This fall, the Army will likely award an additional $350 million, eight-year contract to create an advanced network for battlefield simulations. Industry giants -- including Boeing, AT&T, Sharp, and Fujitsu -- are investing millions, too. At UW's Human Interface Technology Laboratory, some 19 companies have created the Virtual Worlds Consortium to apply VR to business. "Forget the games and electronic sex," says Bryan Lewis, a researcher at IBM. "We are past the hype and pursuing real applications."

This could be a boon to computer giants such as IBM, DEC, Apple, Sun, and graphics workstation maker Silicon Graphics. VR represents a potentially big market -- and a flashy selling point -- for their muscle machines. Startups including Exos, Virtual Vision, and Fake Space Labs are building gear to enhance VR worlds -- viewing devices, acoustical chips, and sensors. Autodesk, Sense8, VPL Research, and others see their fortunes in systems that business can use.

For good reason. Cyberspace worlds that exist only in the electronic ether can be a powerful tool in the hands of architects, engineers, and scientists. They can also be used to boost productivity, improve product design, and provide more cost-effective training. In medicine, VR tools are being used to create 3-D X-rays to help surgeons plan procedures or assist in surgery miles away. Psychologists want to use the technology to treat patients and to study human behavior. Artists and entertainment moguls are pioneering new attractions -- interactive theater, interactive fiction, and even virtual sculpture, cyberspace works that defy the laws of physics.

Whether VR systems will ever match the sophistication they display in fiction is far from certain. The field faces huge technical hurdles: Success will depend on improvements in hardware and software, plus new insights into the human brain and behavior. And as systems become more "real," they will pose thorny ethical questions: Could VR influence people in pernicious ways that conventional media cannot?

Still, VR's social and economic potential seems clear. Democratic Vice-Presidential hopeful Al Gore considers VR so crucial to "the way we design new products, teach our children, and spend free time" that last year he chaired hearings on its value to American competitiveness. The conclusion: The U.S. is underinvesting in the technology.

To VR advocates, that's a mistake. Virtual reality represents "the manifest destiny for computers," asserts Eric Gullichsen, founder of VR software producer Sense8. By creating worlds of color, shapes, sounds, and feel, these systems should amplify the powers of the mind to see previously hidden relationships in complex sets of data and to absorb, manipulate, and interpret information more quickly and completely. The distinction between immersion in a VR world and analyzing the same information using blueprints, numbers, or text "is the difference between looking at an aquarium and putting on your scuba gear and diving in," says Thomas Furness, director of UW's Human Interface Technology Laboratory.

BUMP AND GRAB. Just ask engineers at Northrop Corp., who are using a VR system from Simgraphics Engineering Corp. to help redesign the Air Force's F-18 fighter jet. They model air-intake ducts on computers to make sure they fit through bulkheads, rather than building expensive hard models. An operator wearing wraparound goggles moves parts around with a type of mouse, making sure they fit together in virtual space. The software even simulates resistance, so engineers know when parts "bump" against each other. Project Engineer Robert E. Joy loves the flexibility: "It's like reaching into the workstation and grabbing the part," he says.

VR represents the second major effort in two decades to bring about a dramatic evolution in computers. The aim of the first, artificial intelligence, originally was to build systems that could mimic human reasoning, a goal that has yet to be reached. Virtual reality is the antithesis of what AI tried to do. It aims "to extend the power of the person," says Robert Jacobson, president of WorlDesign, a Seattle VR software startup.

That's what a visualization tool designed by Maxus Systems International does for managers at TIAA-CREF, a New York pension fund with $105 billion in assets. Tracking the performance of a group of stocks against the larger market is a challenge for analysts, who must follow hundreds of ever-changing numbers. Using software from Sense8, the Maxus system converts the numbers to a 3-D schematic of colored squares that move and symbolize individual stocks within grids representing market and industry sectors. It runs on a personal computer and draws on real-time feeds from financial wires.

A specialist in bank stocks may glance at the computer and notice that a box showing banks in the Pacific Rim is active. The squares are red, a signal that the stocks are falling. The analyst uses a mouse to "fly" into the lowest tier of stocks, which have plunged the fastest, and click on the security that has dropped most. Up pops text on that bank. The process takes seconds, so portfolio managers can "identify trends, recognize exceptions, and make decisions more quickly," says Sense8 President Tom Coull. "That can translate into a tremendous amount of money."
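
The translation from ticker numbers to moving squares is, at heart, a simple mapping. Here is a minimal Python sketch of that kind of mapping; the field names, color rule, and tier calculation are assumptions made for illustration, not details of the actual Maxus or Sense8 software.

```python
# A minimal sketch of the data-to-graphics mapping described above.
from dataclasses import dataclass

@dataclass
class Stock:
    ticker: str
    sector: str        # e.g. "Banks"
    region: str        # e.g. "Pacific Rim"
    pct_change: float  # intraday percent change from a real-time feed

def to_glyph(stock: Stock) -> dict:
    """Convert one stock quote into a colored square for the 3-D grid."""
    falling = stock.pct_change < 0
    return {
        "grid": (stock.region, stock.sector),           # which grid the square sits in
        "color": "red" if falling else "green",         # red signals a falling stock
        "tier": -stock.pct_change if falling else 0.0,  # deeper tier = faster drop
        "label": stock.ticker,                          # text popped up on a mouse click
    }

quotes = [Stock("BK1", "Banks", "Pacific Rim", -3.2),
          Stock("BK2", "Banks", "Pacific Rim", -0.4)]
glyphs = sorted((to_glyph(q) for q in quotes), key=lambda g: g["tier"], reverse=True)
print(glyphs[0]["label"], "has dropped the most")
```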

FLYING MICE. Such a system falls short for VR purists, who argue that only an immersive experience with a helmet holding two stereoscopic screens and headphones will do. That way, you see and hear only what the computer generates, interacting with the environment as in the real world. At NASA Ames Research Center in Mountain View, Calif., this approach lets you look around the surface of Mars, which has been recreated from satellite data. A motion sensor in the helmet lets you look in any direction, and the computer rerenders the scene to reflect your new perspective on the Martian landscape.

Still, theater-style simulations and two-dimensional computer displays can be just as powerful. Using a Silicon Graphics Inc. system, urban planners in Los Angeles are building an 80-block-by-80-block virtual model of renovation plans for riot-damaged areas. The value: It's hard for untrained people to read blueprints, and models are expensive. Yet, community involvement is essential. This way, residents can use a mouse to "fly" through the streets as if they were in a helicopter. And designers can pop in a park bench or delete a 7-Eleven, testing suggestions from those who live in the real Los Angeles.

The idea of using computers to render useful environments dates back to the 1960s. Back then, however, the computing power needed to generate even crude 3-D graphics was so expensive that only government agencies such as Defense or NASA, plus a few university labs, could afford it. Even today, special helmets used for military flight simulators can cost $1 million.

The field began to attract attention when onetime computer hacker Jaron Lanier coined the term virtual reality in the mid-1980s. In 1984, he founded VPL Research Inc. in Foster City, Calif. -- the first company dedicated to VR worlds (page 104). VPL has developed key VR aids -- head-mounted stereo screen displays, or "eyephones," plus the "dataglove" and the "datasuit," which let VR viewers convey information to computers with hand signals. Don a Dataglove, and an image of a hand appears in the virtual world, so you can point to objects, pick them up, or command the computer.

More than anything else, though, the relentless increase in performance -- and decrease in price -- of semiconductor chips is driving VR by allowing computer makers to build more sophisticated graphics systems. At the high end, Silicon Graphics' new $100,000 "Reality Engine" computes 1,000 times as fast as most PCs, allowing it to provide quick rendering and real-time motion in VR worlds. On the low end, desktop VR systems based on Intel Corp.'s 486 chip cost as little as $20,000. Richard H. Dym, general manager for multimedia at Autodesk Inc., calls new programming tools and applications for these systems the leading edge of software development.

Entertainment is one of the first beneficiaries. Nintendo Co.'s $99 Powerglove, a simpler version of VPL's $8,800 Dataglove, lets video-game wizards play with hand gestures and has already helped spawn a host of VR-like video games. Virtual World Entertainment LP's VR game site in Chicago, the "Battletech Center," has sold some 300,000 tickets at $7 each since it opened in July to players who sit in an enclosed cockpit to engage in Star Wars-like battles. The company has two sites in Japan and plans to open 17 more over the next three years.

`TELEPRESENCE.' In business, much VR technology will evolve out of current computer systems. Computer-aided design, or CAD, systems have been around for years. Adding VR's greater resolution and interactivity can enhance their utility, as Chrysler, among others, is discovering.

"Telepresence," a VR tool that refers to the remote manipulation of equipment, shows similar potential. The Japanese construction company Fujita Corp. has hired VPL to help it build a system that lets an operator in Tokyo direct a spray-painting robot anywhere in the world. The operator views the building to be painted on a computer, then works controls that signal the robot to spray. With VR, the image is so painstakingly exact that the human operator makes no mistakes in directing the operation.

In business education and job training, VR's chief benefit would be lower costs. The Electric Power Research Institute has teamed up with MITRE Corp. to determine if an electronic mock-up of a power-plant control room using stereo projection displays can be effective in training plant operators. Today's training rooms for fossil-fuel plants cost up to $1 million. Using VR, the cost might dip under $100,000. And eventually, says Hugh W. Ryan, director of new-age systems for Arthur Andersen Consulting, VR worlds will be used to simulate business interactions -- from sales negotiations to general management problems -- and will replace some of today's expensive seminars and classes.

VR may also help train workers for flexible manufacturing. Boeing Co.'s project manager for human-computer interactions, Keith Butler, is developing techniques to project job instructions onto see-through goggles worn by assembly workers or onto the work space in front of them. In theory, instructions presented this way could replace hours of training in which workers learn jobs, then must be trained again when the task changes. With such displays, a worker might assemble wing flaps, then switch to nose cones on the same day with little loss of productivity.

In perfecting such systems, developers must solve some novel problems. Why do some people become nauseated when navigating in cyberspace? And if you have to make a trade-off between complex, realistic graphics and live-action motion, which is more important for maintaining the illusion of reality?

The answers to such questions lie in the cognitive and behavioral sciences. Greater knowledge of the structure of the brain, how it processes information, and how people think and perceive is the key. Such research already indicates why VR worlds are so effective in training, says Roger Schank, director of the Institute for the Learning Sciences at Northwestern University. Studies show that in general, people reason or solve problems based on cases, examples, and experience, not by learning rules. "That's why the flight simulator is the best piece of educational software ever made," says Schank.

GENETIC CUES. One of the key assumptions of VR work is that the brain can process information better when it is presented through sight, sound, and touch instead of just text or numbers. Scientists also are finding that the responses to certain visual cues -- including hand-eye coordination and the ability to detect the edges of objects and to recognize movement across a meadow of grass -- are encoded in genes. Our cave-dwelling forebears originally developed these responses in reaction to the world around them, says Ronald M. Pickett, professor of psychology at the University of Massachusetts at Lowell.

Pickett and others are designing software icons that mimic those cues. "We want to trick the visual system to evoke quick, natural perceptual processes in the service of analyzing data," he says. To do that, he has created an icon that looks like grass. It changes length, curve, and arc to represent numeric data such as income level, age, and sex. Each icon can convey multiple characteristics that can be comprehended at a glance.
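
In code, such an icon amounts to a function from one data record to a few geometric parameters. The sketch below is a hypothetical rendering of Pickett's idea; the field names and scaling ranges are invented for illustration.

```python
# A sketch of the data-to-icon mapping Pickett describes: each record drives
# the geometry of one grass-blade glyph. The ranges below are assumptions.
def grass_blade(record: dict) -> dict:
    """Map one data record onto the length, curvature, and arc of a blade icon."""
    def scale(value, lo, hi):
        # clamp and normalize a raw value into the 0..1 range a renderer expects
        return max(0.0, min(1.0, (value - lo) / (hi - lo)))

    return {
        "length":    scale(record["income"], 0, 200_000),   # taller blade = higher income
        "curvature": scale(record["age"], 0, 100),           # more bend = older subject
        "arc":       0.0 if record["sex"] == "F" else 1.0,   # lean direction encodes sex
    }

field = [grass_blade(r) for r in [
    {"income": 45_000, "age": 34, "sex": "F"},
    {"income": 120_000, "age": 61, "sex": "M"},
]]
print(field)
```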

Whether people experience virtual worlds as "real" doesn't depend entirely on real-time motion, graphics, or visual cues, however. One of the most difficult challenges is to imbue computer characters with humanlike qualities. As part of that effort, Joseph Bates, a computer scientist at Carnegie Mellon University in Pittsburgh, is trying to create VR drama -- interactive programs in which computer characters and people collaborate to create stories or situations. At first, it's hard to understand how an animated landscape with four bouncing blobs could be relevant. The blobs' only activity is jumping up and down, and they are supposed to take turns "leading." But when one ball starts to dominate the activity, the others react. They change color, or slow down. One even turns from red to blue, retreating to a corner, its sides heaving, to . . . well, sulk.

The balls appear to be exhibiting emotion and acting independently because Bates and his colleagues have programmed them based on theories of behavior. These hold that emotion -- and the behavior that results from it -- arise from goals that are being met, opposed, or otherwise affected. When programmed this way, the blobs begin to act as if they have "personalities," and people can identify with them.
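
A toy version of that rule makes the mechanism concrete: check whether a creature's goal is being met or opposed, then set its "mood" and appearance accordingly. The Python sketch below illustrates the principle only; it is not the Carnegie Mellon code, and its thresholds are assumptions.

```python
# A toy version of the goal-based behavior rule described above: emotion follows
# from whether a creature's goal is being met or opposed.
class Blob:
    def __init__(self, name):
        self.name = name
        self.goal = "take a turn leading"
        self.mood = "content"
        self.color = "red"

    def observe(self, current_leader, my_turns, total_turns):
        # If one blob hogs the lead, this blob's turn-taking goal is opposed.
        goal_opposed = current_leader != self.name and my_turns < total_turns / 4
        if goal_opposed:
            self.mood = "sulking"
            self.color = "blue"   # retreat to a corner, slow down, change color
        else:
            self.mood = "content"
            self.color = "red"

b = Blob("Blue-two")
b.observe(current_leader="Red-one", my_turns=1, total_turns=12)
print(b.name, "is", b.mood)
```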

`BARFOGENIC ZONE.' Building on such work, researchers one day hope to populate virtual worlds with creatures -- human-looking or not -- that people interact with as they would another person. These characters might analyze a problem, monitor an experiment, or play the role of someone in a business simulation -- a hot sales prospect, say. They would probably react to voice commands but would also need to convey and understand more subtle human communication such as body language. Sound fantastic? Not to Fujitsu Ltd., which has invested $250,000 in Bates's work. His work reinforces Fujitsu's research in "artificial life," computer algorithms that behave like biological entities and could become the basis of computer-generated characters in VR worlds.

Fine-tuning the sensory and psychological factors that make a VR world "real" is a further technical challenge. Experience shows that VR viewers adjust to low-resolution monitors. The brain also accepts both slow, jerky frame rates and much faster, live-action speed -- 30 frames per second. But in between lies what Thomas P. Piantanida, principal scientist of SRI International's Virtual Perception Program, calls the "barfogenic zone" -- from 4 to 12 frames per second. In that range, the mismatch between what the brain expects and what it sees can make viewers sick. Until computers can create complex worlds with live motion, Piantanida's work suggests that it's better to run crude displays faster than to run detailed displays in the barfogenic zone.
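
That design rule can be stated as a simple frame-rate budget: degrade the scene before you degrade the motion. The sketch below shows the idea under assumed render costs and detail levels; it is not drawn from SRI's work.

```python
# A sketch of the rule of thumb Piantanida's findings suggest: if the hardware
# cannot hold a scene at live-action speed, drop scene detail rather than let the
# frame rate sink into the 4-to-12-frame-per-second "barfogenic zone".
BARFOGENIC_HIGH_FPS = 12.0   # rates between 4 and 12 fps tend to sicken viewers

def choose_detail(ms_per_frame_at_full_detail: float,
                  detail_levels=(1.0, 0.5, 0.25, 0.1)):
    """Pick the richest detail level whose frame rate clears the sickness zone."""
    for detail in detail_levels:
        fps = 1000.0 / (ms_per_frame_at_full_detail * detail)
        if fps > BARFOGENIC_HIGH_FPS:
            return detail, fps
    # Even the crudest display is too slow -- return it anyway as the least bad option.
    return detail_levels[-1], 1000.0 / (ms_per_frame_at_full_detail * detail_levels[-1])

detail, fps = choose_detail(ms_per_frame_at_full_detail=120.0)
print(f"run the display at {detail:.0%} detail, about {fps:.0f} frames per second")
```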

Putting sound to virtual worlds is one more key to improving people's ability to absorb information. "Our ears point our eyes," says NASA Research Psychologist Elizabeth M. Wenzel, an expert in adding 3-D sound to virtual environments. A military pilot, for instance, often monitors as many as eight conversations from air and ground sources through the same earpiece. Wenzel says that making the sound appear to come from different directions helps pilots key in on high-priority information. A new circuit board developed by NASA and Crystal River Engineering Inc. that produces 3-D sound will make it easier to put sound in virtual worlds. The chips mimic the shape of sound waves as they hit the human ear from different directions, creating the illusion of distance as sounds grow louder and softer.
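
A crude way to grasp the idea is a function that pans a sound between the two ears by direction and softens it with distance. The sketch below does only that; it is a stand-in for, not a model of, the ear-shaped filtering the NASA and Crystal River hardware performs.

```python
# A much-simplified sketch of spatializing a sound source: pan between the ears
# by direction and attenuate by distance. The attenuation curve is an assumption.
import math

def spatialize(source_xy, listener_xy=(0.0, 0.0)):
    """Return (left_gain, right_gain) for a source at source_xy, listener facing +y."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = math.hypot(dx, dy) or 1e-6
    attenuation = 1.0 / (1.0 + distance)      # farther sounds grow softer
    azimuth = math.atan2(dx, dy)              # 0 = straight ahead, positive = to the right
    pan = (math.sin(azimuth) + 1.0) / 2.0     # 0 = hard left, 1 = hard right
    return attenuation * (1.0 - pan), attenuation * pan

left, right = spatialize((3.0, 4.0))   # a voice ahead and to the right
print(f"left {left:.2f}, right {right:.2f}")
```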

VR researchers are opening another portal to the brain through so-called force feedback. The idea is to build weight, resistance, or attraction into joysticks, so that VR voyagers can "feel" simulated objects. Researchers at Digital Equipment Corp. are working with outside chemists to simulate the forces of molecular attraction and repulsion. Their goal is to develop a system within two years that will help chemists feel these forces as they experiment with 3-D images of molecules to develop drugs and other chemicals. That's important because molecules that appear to be compatible often are not. Knowing this in advance could help scientists avoid blind alleys.
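
The principle is straightforward to sketch: compute the attraction or repulsion between two atoms at their current separation, then clamp it to whatever force the joystick can physically render. The example below uses a textbook Lennard-Jones formula as a stand-in; DEC's actual model is not described here.

```python
# A sketch of the force-feedback idea: compute an attraction/repulsion force
# between two atoms and send it to a joystick as resistance. The Lennard-Jones
# form and the constants are textbook stand-ins, not DEC's model.
def lennard_jones_force(r: float, epsilon: float = 1.0, sigma: float = 1.0) -> float:
    """Force between two atoms a distance r apart (positive = repulsive)."""
    return 24 * epsilon * (2 * (sigma / r) ** 13 - (sigma / r) ** 7) / sigma

def joystick_resistance(r: float, max_force: float = 10.0) -> float:
    """Clamp the physical force into the range the haptic joystick can render."""
    f = lennard_jones_force(r)
    return max(-max_force, min(max_force, f))

# As the chemist pushes two molecules together, resistance rises sharply.
for r in (2.0, 1.2, 1.0, 0.95):
    print(f"separation {r:>4}: feedback force {joystick_resistance(r):+.2f}")
```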

The more sophisticated VR worlds become, the more controversy they may generate. Some psychologists want to use VR in psychotherapy to alter the perspective of patients, or to recreate environments that cause stress or other problems as a way to help treat phobias, depressions, and schizophrenia. British psychologist Peter Ward, who plans to use VR to treat spider phobia, thinks some patients may feel more comfortable with a machine than with a human therapist.

Still, simulations with the power to make soldiers sweat might wreak havoc on fragile psyches. Indeed, widespread use of VR, some worry, could influence people in harmful ways. Could immersion in VR worlds incite violence, become addictive for some people, or lead to computer-generated manipulation of others? It will be years before anyone knows for sure. But, muses Bob Jacobs of Illusion Engineering, which develops military simulations: "We may eventually need a code of ethics for cyberspace."

In fact, a down-to-earth dilemma arose this year when a VR program helped convict a man of manslaughter in California (page 99). And some critics believe that VR training exercises could alter the view of what constitutes valuable work experience. Take two candidates for the job of nuclear-plant manager. Who should get the nod -- a veteran plant worker with a decade of no mistakes, or a less experienced candidate who scores higher in simulations of disaster? "This scares the hell out of some hierarchical types," says Michael W. McGreevy, principal engineer at NASA's Aerospace Human Factors Research Div.

BLURRY VISION. Formidable hurdles remain before VR systems can reach their full potential. "We need a whole bunch of technologies that are still in their infancy," says VR pioneer Henry Fuchs, professor of computer science at the University of North Carolina. Researchers are only making slow headway toward improving today's often blurry head displays. And a camera that digitizes the image of a room and turns it into a VR environment remains elusive: So far, computers can't distinguish between edges, lines, and shadows sufficiently to translate a video image into 3-D. It's no easy task to get so many disciplines -- programming, behavioral science, and hardware design -- to work together to produce those advances.

The task is so arduous that some VR advocates worry about being engulfed by the cycle of hype, then hopelessness, that befell artificial intelligence. Still, VR represents a potent direction in technology. Inevitably, as computers gain more power, more work will focus on making the interactions between humans and machines more efficient. Watch a roomful of charged-up players in Chicago's Battletech Center go at it -- oblivious to the real world -- and you can't help thinking that you're seeing the makings of the ultimate tool for the mind.

Joan O'C. Hamilton in San Francisco, with Emily T. Smith in New York, Gary McWilliams in Boston, Evan I. Schwartz in New York, John Carey in Washington, and bureau reports

