Real-world devices with gesture-reading abilities are set to debut next year
Walk into Building 99 on Microsoft (MSFT)'s Redmond (Wash.) campus, and you'll be greeted by a dark-haired receptionist named Laura. She isn't a real person. She's a digital creation peering out from a computer screen. Yet she talks and interacts with visitors. She responds to a nod or shake of the head, as well as other gestures and voice commands. The technology behind Laura is one example of how companies are looking to free consumers from keyboards and remote controls in favor of more natural interaction with computers, TVs, and mobile devices.
Laura isn't going on sale at your local electronics retailer anytime soon. But the enormous popularity of Nintendo (NTDOY)'s motion-sensing Wii game console and Apple (AAPL)'s touchscreen iPhone has many people predicting that mainstream electronics controlled by gestures and voices will become common in the near future. "We want to improve the user experience by using gestures and natural language," says Eric Horvitz, a Microsoft researcher.
The technology will start to be incorporated into TVs, PCs, and other devices next year. Japanese TV maker Hitachi (HIT) aims to become the first major brand to sell sets that respond to gesture commands. It's part of a broader effort to reach people who are put off by increasingly complex gadgets.
The idea of devices that respond to gestures dates back even further than HAL, the computer gone awry in the 1968 film 2001: A Space Odyssey. Only now, though, has technology advanced to the point where gesture-based systems are moving into the mass market.
The biggest breakthrough? Improved cameras that can detect how far away an object is from the lens, making interpretation of movement far more accurate. While the cameras are still a bit expensive, prices are expected to fall enough that it will soon be possible to build them into TVs and other electronics at low incremental cost. Consumers could then sit in a chair 10 feet from a TV and wave a hand to change the channel, or rotate a hand clockwise during a PowerPoint presentation to raise the volume on a laptop across the room.
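To see why depth data makes gestures easier to read, consider a toy sketch (in Python, with invented frame data, not any vendor's actual API): if each camera frame records the distance to every point in view, the viewer's hand is simply the nearest object, and a wave is that object sweeping one way and then back.

```python
# Toy illustration of depth-based gesture detection. Each "frame" is a
# 2-D grid of distances in metres; the hand is assumed to be the nearest
# point to the camera. A wave = the nearest point sweeping left, then right.

def nearest_point_column(frame):
    """Return the column index of the smallest depth value in the frame."""
    best_col, best_depth = None, float("inf")
    for row in frame:
        for col, depth in enumerate(row):
            if depth < best_depth:
                best_col, best_depth = col, depth
    return best_col

def is_wave(frames, min_sweep=2):
    """Detect a wave: the nearest point moves one way, then reverses."""
    cols = [nearest_point_column(f) for f in frames]
    deltas = [b - a for a, b in zip(cols, cols[1:])]
    went_right = sum(d for d in deltas if d > 0)
    went_left = -sum(d for d in deltas if d < 0)
    # A wave needs meaningful movement in both directions.
    return went_right >= min_sweep and went_left >= min_sweep

# Simulated frames: a near object (0.5 m hand) sweeps right across a
# 2.0 m background, then back left -- a hand waving at the sensor.
background = [[2.0] * 6 for _ in range(2)]

def frame_with_hand(col):
    f = [row[:] for row in background]
    f[0][col] = 0.5  # the hand is the closest thing to the camera
    return f

wave = [frame_with_hand(c) for c in (0, 2, 4, 2, 0)]
print(is_wave(wave))  # True
```

An ordinary 2-D camera would have to distinguish the hand from the background by colour or shape; with per-pixel distance, one comparison does the job, which is why falling depth-camera prices matter so much here.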
Gesture technology is new enough that experts define the market in different ways. Some count only products that people can control without touching any buttons or screens. Others include touchscreen products, which require some contact. The market for screens capable of sensing touch is expected to expand fivefold by 2013, to $500 million, says analyst Roger L. Kay at Endpoint Technologies.
Phil McKinney, chief technology officer at Hewlett-Packard (HPQ)'s Personal Systems Group, says the company has high hopes for gesture technology. Later this year, HP plans to incorporate more gesture features into its TouchSmart PCs, though it won't specify what those will be. The ability to use gestures "significantly removes the overall intimidation factor of people using the product," says McKinney.
As costs come down, more uses for gesture-based systems are likely to emerge. Camera-equipped devices could help caregivers determine if elderly patients have fallen or whether they can respond to questions with a wave or a nod. In schools, the interactivity of gesture-based systems could help keep children from getting bored, McKinney says.
Gesture and voice controls do present challenges. One is regional variation in language, dialect, and culture: Americans point using a single finger, for example, while that's considered rude in parts of Asia. Consumers will also have to learn new ways of getting products to do what they want. Despite the hurdles, says Microsoft's Horvitz, "we're starting to weave what we've learned together to make this work."