The Intel researcher is designing computers that can recognize gestures and objects to better bridge the virtual and real worlds
Beverly Harrison's kitchen countertop is a lot smarter than yours. Place a steak and a green pepper on it, and the surface lights up with a recipe that integrates both ingredients. Don't like the kitchen's suggestion? Sweep it over the edge of the counter as if the recipe were a pile of potato peelings.
Harrison, a researcher for Intel (INTC), designed the setup because she thinks there's a better way to interact with computers than simply "logging into the box," as she characterizes the current state of affairs. The kitchen array consists of a camera suspended above the counter. It sends images to a small, out-of-sight computer that recognizes certain objects and hand gestures. When it spots a food item, a palm-sized projector beams useful data—such as cooking instructions—onto the counter. The system is an attempt to design behind-the-scenes computers that bridge "this big disconnect between living in the virtual world and living in the physical world," she says. Harrison's group recently finished a new version of the setup that animates scenes around Lego sets.
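The loop Harrison describes—recognize what's on the counter, project a response, dismiss it on a sweep gesture—can be caricatured in a few lines of Python. Everything below (the function names, the event labels, the toy recipe table) is invented for illustration; Intel has not published this system's code.

```python
# Toy sketch of the counter-top pipeline: a recognizer labels what the
# camera sees, and the system decides what the projector should show.
# All names and the one-entry recipe table are hypothetical.

RECIPES = {
    # keyed by the set of ingredients the vision system reports
    frozenset({"steak", "green pepper"}): "Pepper steak stir-fry",
}

def suggest(detected_objects):
    """Return a recipe for the set of recognized foods, or None."""
    return RECIPES.get(frozenset(detected_objects))

def handle(event, detected_objects, display):
    """Route a recognized object or gesture to a projector action."""
    if event == "objects_detected":
        recipe = suggest(detected_objects)
        if recipe:
            display.append(recipe)   # projector shows the suggestion
    elif event == "sweep_gesture":
        display.clear()              # swept off the counter: dismiss it

display = []
handle("objects_detected", {"steak", "green pepper"}, display)
# display now holds the stir-fry suggestion
handle("sweep_gesture", set(), display)
# display is empty again
```

In the real setup, the `"objects_detected"` and `"sweep_gesture"` events would come from a computer-vision model watching the overhead camera feed rather than from hand-written calls.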
Harrison, who holds a PhD in engineering from the University of Toronto and spent 10 years in labs at IBM (IBM) and other companies before landing at Intel, doesn't expect her employer to enter the kitchen appliance market any time soon. Intel is, however, interested in diversifying away from the PC chip business, which accounts for more than 90 percent of its sales. One way to do that is to more naturally integrate computers into people's lives, says Harrison. Her gesture-driven computers may be particularly useful in situations where it's impractical to use a keyboard, mouse, or monitor. For instance, some workers have to wear clean-room suits while handling sensitive, high-tech components such as silicon chips or photovoltaic cells. The bulky protective gear makes it difficult to type or manipulate a mouse. With one of Harrison's setups, a worker could instead use simple hand gestures to interact with a computer, or trace numbers and letters on a countertop to input data.
Intuitive human-computer interactions like this may eventually become commonplace, says David Wu, an analyst at GC Research in San Francisco. Work such as Harrison's "will make the computer a much more natural companion for human beings," he says. He notes that new game technology, such as Microsoft's (MSFT) Kinect for Xbox 360, already relies on natural gestures and motion-sensing equipment to power game play.
For all her work dreaming up ways to more closely integrate computers into daily tasks, Harrison leads a largely computer-free existence when she's not at work. "I go home and I have one TV, one stereo," she says. "And I read a lot."
Spent 10 years working in industrial labs at companies such as IBM
A setup that recognizes foodstuffs, then suggests recipes
Computers that rely on gestures instead of keyboards and mice