Apple watchers assumed Jobs & Co. would be the first to offer a multitouch laptop, but Hewlett-Packard has beaten them to the punch
From the first time Steve Jobs demonstrated "the pinch"—the two-finger gesture used to zoom in and out of photos and Web pages on the iPhone—some Apple observers assumed it was just a matter of time before a multitouch-enabled screen showed up on the Mac.
That hasn't happened yet. But as of Nov. 19, Hewlett-Packard (HPQ) has beaten Apple (AAPL) to the punch, announcing the first multitouch-enabled notebook PC, the tx2. I can't help but wonder whether Apple just lost an important race.
Hewlett-Packard's tx2 is a convertible notebook, meaning its screen can pivot 180 degrees to show someone else what's on it, or fold flat and act like a tablet PC. It's the first convertible notebook aimed directly at consumers. The tx2 sports HP's version of multitouch technology, which lets you use two fingers at once to manipulate images on the screen or make on-screen gestures that signify specific commands. A pinch motion zooms in and out just as it does on an iPhone, and other gestures let you rotate pictures or press and drag files around. And the tx2 is priced to move, starting at $1,150, only $151 more than the entry-level MacBook. I briefly interacted with the machine during a meeting this week with HP.
From Star Trek to Ho-Hum
HP has been heavily promoting touch interfaces for about two years. Earlier this year the company launched its second touch-based desktop PC, the TouchSmart (BusinessWeek, 6/25/08). The tx2 is a direct evolutionary result of that effort. HP is also releasing a new model with a larger screen and multitouch support on Nov. 19.
Touch-enabled screens are nothing terribly new. Computers have long been able to accept input from a finger touching the screen. When I withdraw cash from my bank, it's usually at an ATM with on-screen "buttons." Controlling computers by touching their displays was a fanciful idea in the 1980s, on TV's Star Trek: The Next Generation (BusinessWeek.com, 3/15/07). Now it's de rigueur.
Apple moved the multitouch needle—or finger, as it were—with the iPhone. The device lets you use two or more fingers at once, and Apple has several patents for how this works on the iPhone, whether it's pushing buttons, following links, selecting a picture or song with a single finger, or zooming in and out of an image with a two-finger pinch. The phone even boasts on-screen game controllers that are arguably better than those of any other handheld gaming device (BusinessWeek.com, 11/4/08).
Still, Apple so far sells only two multitouch-enabled products, the iPhone and the iPod Touch. Over the last two years, rumors have surfaced that one day Apple's notebooks—the MacBook and MacBook Pro—will evolve to include multitouch-enabled displays. Another theory is that Apple will create an Internet-ready tablet modeled after its iPod Touch—a Wi-Fi Internet device that does e-mail, browses the Web, plays music and video, and runs most iPhone applications.
Yet the only nod to a multitouch interface on the Mac has been the trackpad on the latest MacBooks. A user can, with two, three, or four fingers, make certain motions on the glass trackpad to invoke all sorts of different commands that would otherwise require a keyboard or a mouse.
Steve Jobs, Foot-Dragger?
But what's Apple got against touching the screen directly? Lineage may have something to do with it. The tx2 is a descendant of Microsoft's (MSFT) push for tablet PCs that started in 2000. Some tablets gained acceptance among certain types of business users, but they never really took off with consumers. "Fundamentally, Apple has not bought into the tablet concept," says analyst Tim Bajarin, who runs consulting firm Creative Strategies. "Bill Gates has been preaching this idea for eight years, but there's no sign that Apple has any interest in it."
Apple's apparent foot-dragging is ironic, considering the company showed with the iPhone that handheld devices are essentially little computers, and that a touchscreen does away with the need for either a stylus or a keyboard. And intuitively, touch interfaces are more natural than a keyboard or a mouse. Manipulating real objects on a table in front of us should translate well to a screen, right?
Perhaps, but given the current physical attributes of desktops and notebooks, a touch interface would be awkward. Sitting here in front of my desktop and notebook displays, I imagine that repeatedly reaching up to touch the screen a few thousand times a day would make my arms and shoulders tired. Ideally, touch-based computers would be built into the surfaces we sit at. This is still a long way off. (And I don't think Microsoft's costly, clunky "Surface" coffee-table computing concept brings us close.)
Still, if there's one company that can and should be showing the way, it's Apple. Why aren't we flipping through album and movie covers on iTunes on the screen in the same way we do on an iPhone? Or thumbing past pages of digital books and magazines on a Mac-based tablet? Two years into the Apple-inspired revolution in touch interfaces on smartphones, I'm disappointed the only computers doing that run Windows.