Live streaming video still has its fair share of hiccups, but the industry is working to ensure the supply of bandwidth keeps up with demand
If you've been on the Web in recent days, you've no doubt noticed that CBS is pulling out all the stops to cash in on March Madness. Unlike a year ago, when games were only shown to registered CBS users, all 63 games of this year's NCAA basketball tournament will be available for live streaming to anyone (BusinessWeek, 3/20/08).
And for the first time, CBS is letting hundreds of other sites—including Facebook—feature its coverage as well. "We expect [viewer] growth of 50% from last year," CBSSports.com General Manager Jason Kint was predicting before the tournament began. He was wrong. On the first day of play, 1.7 million people logged on—more than doubling 2007's traffic.
That sounds terrific if you're a CBS (CBS) executive—but it's a nightmare for the techies who have to make it all come off as glitch-free as possible. For the past six months, they've been carefully planning for the flood, reassigning many of the 600 servers from CBS's football fantasy league and negotiating deals with network partners to crank up capacity by caching the games on thousands of servers at thousands of locations.
Better Service for Net VIPs
Even with all that, CBS has hedged its bets by only promising fast, smooth delivery to the 500,000 "VIPs" who preregistered. Others may have to wait minutes for the video to start. "We'd rather have a great experience for 500,000 users" than have everyone go away unsatisfied, says Tony Fernandez, CBSSports vice-president for technology.
Welcome to the new world of massive-scale Net media. First came the bouillabaisse of eclectic prerecorded YouTube videos. With that, the world's media companies began racing to cater to the online addictions of the MySpace generation, offering many of the most popular TV shows online and creating new ones specifically for the Web. Now comes the advent of truly gargantuan online events.
Just ask Oprah Winfrey. In early March, many of the 700,000 viewers who tuned in for the first in a 10-part series of highly promoted Webcasts encountered glitches almost from the start. The video slowed, dropping out of sync with the voices. "It got unwatchable," says Margie Backaus, an executive with Equinix who went home early that day to log on. "After 15 minutes, I said, 'I'm out.'"
Hoping for a TV-Like Experience
Oprah's production company quickly fixed the glitches, and two subsequent shows came off just fine. But such slip-ups raise a bigger issue: how to keep up with soaring demand as Net video goes mainstream. The basic math is daunting. Even garden-variety video takes up vastly more bandwidth than e-mail and Web surfing, and there's little doubt that demand for high-definition quality will soar as the growing number of consumers with HD-TV sets and cable programming comes to expect a TV-like experience on the Net.
When you click on a Web link or send an e-mail, a few seconds' wait is no big deal—and doesn't tie up any bandwidth until your next click. Not so with video, which requires a "persistent" link to ensure no delays. Worse, video tends to be viral, creating "flash crowds" of people all wanting to view it at the same time.
And that's just prerecorded content. Live events are far trickier, as is anything that involves interactivity—whether it's serving up ads targeted to particular viewers, or making sure a video game player isn't obliterated due to technical difficulties in an online multiplayer competition. Such content can't simply be cached at an Internet node near you; each user's clicks need to talk more directly to the central server.
Comcast Embroiled in Controversy
Naturally, the industry is on the case, pursuing a host of strategies and technologies to make sure the supply of bandwidth never falls too far behind demand. For starters, carriers such as Verizon (VZ) and Cablevision (CVC) are spending billions to expand the bandwidth into people's homes and cell phones, while installing ever-bigger routers within their networks. AT&T (T) and Verizon Wireless, for example, just spent a combined $16.3 billion in a government auction for additional radio spectrum that can be used to deliver high-speed wireless Internet services (BusinessWeek.com, 3/20/08).
Even so, experts say it's unlikely these carriers could ever afford to install enough of the extra gear to satisfy demand for bandwidth. As such, they're using software tricks such as dynamic bandwidth allocation, reducing one user's capacity while he's browsing eBay (EBAY) to give his neighbor more bandwidth as he catches up on the latest 24 episode. And Comcast (CMCSA) has gotten itself embroiled in controversy by using technology to impede a certain kind of direct "peer-to-peer," or P2P, traffic between users. According to Cisco Systems, such P2P traffic chews up roughly 60% of consumer online traffic.
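Dynamic bandwidth allocation of this kind can be sketched as a weighted split of a shared link among active flows. This is only an illustrative model, not any carrier's actual algorithm, and the flow names and weights below are invented:

```python
def allocate_bandwidth(capacity_mbps, flows):
    """Split a shared link's capacity among active flows by weight.

    `flows` maps a flow name to a priority weight; a heavy video
    stream gets a proportionally larger share than light browsing.
    """
    total_weight = sum(flows.values())
    return {name: capacity_mbps * weight / total_weight
            for name, weight in flows.items()}

# One neighbor streams a show (weight 4) while another browses (weight 1):
shares = allocate_bandwidth(20.0, {"video_stream": 4, "web_browsing": 1})
# video_stream gets 16.0 Mbps, web_browsing gets 4.0 Mbps
```

In practice, carriers recompute shares continuously as flows start and stop, which is what lets a neighbor's idle capacity flow toward the video watcher next door.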
Another part of the solution will undoubtedly be so-called content distribution networks (CDNs) that offload video, music, ads, and other fare from the central servers. Akamai Technologies (AKAM), the king of this business, currently handles about 15% of all online traffic. With 30,000 servers installed in roughly 1,000 networks around the world, Akamai doles out everything from iTunes songs to all of those March Madness games.
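The basic CDN idea, serving cached copies from a server near the viewer instead of hitting the central origin on every request, can be sketched as a toy model (real CDNs add expiry, load balancing, and request routing; the content name here is invented):

```python
class EdgeCache:
    """A toy edge server: serve from local cache, fetch from origin on a miss."""

    def __init__(self, origin):
        self.origin = origin          # stands in for the central servers
        self.cache = {}
        self.origin_fetches = 0

    def get(self, content_id):
        if content_id not in self.cache:   # cache miss: go back to the origin
            self.origin_fetches += 1
            self.cache[content_id] = self.origin[content_id]
        return self.cache[content_id]

edge = EdgeCache(origin={"game_highlights": b"<video bytes>"})
for _ in range(1_000_000):                 # a flash crowd of viewers
    edge.get("game_highlights")
# The origin is contacted exactly once; every other request hits the edge.
```

Multiply that one edge server by Akamai's 30,000 and the flash crowd's load on the origin all but disappears.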
BitGravity Offers High-Definition Quality
But new rivals insist Akamai's distributed approach is best suited for serving Web pages and photos—not interactive videos. Limelight Networks, which is handling the Oprah series, has just 7,000 servers, but they're far more powerful.
Rather than tying up network bandwidth by constantly resending big video files to Akamai's lower-capacity machines, Limelight's servers have the space to hold all of its customers' content. This cuts down on bandwidth costs and provides a more direct connection from a content owner to a viewer's PC. It's worth noting, however, that Limelight recently lost a patent-infringement suit to Akamai, which is due $45 million in damages and will seek an injunction prohibiting Limelight's further use of the offending technology. For his part, Akamai CEO Paul Sagan is unfazed by rivals' claims: "We've spent 10 years getting ready for the broadband explosion," he says. "Our original patents were all about video and rich media."
Meanwhile, a host of startups are pushing the state of the art. BitGravity uses even fewer, more powerful servers than Limelight. It promotes its ability to offer HD quality, and fans say its technology could enable entirely new interactive experiences. A sports fan, for example, could choose to view a game from a variety of camera angles. In the years ahead, TV watchers might be able to call up various panels on their big screen—maybe the Super Bowl taking up most of the screen, with an ESPN.com feed and a video chat with a friend on the right. "If the Internet is to become a mainstream distribution vehicle for video, I think [BitGravity] will be a major player," says Blake Kerkorian, CEO of SlingMedia, which makes devices that send TV shows to laptops and other devices.
Israeli startup Arootz is taking yet another approach, blasting the most popular shows and video clips to TiVo-like storage devices in customer homes using Internet multicasting (BusinessWeek.com, 6/27/07). Distributing content in advance—and during off-hours—promises to lessen the burden on networks during peak times.
New Respect for Peer-to-Peer Technology
Another surprising entrant may play an even more crucial role in easing the video traffic jam: peer-to-peer technology. Until now, P2P has been despised by the music and film industries, because it's often used to share pirated tunes and flicks for free. And broadband providers have demonized P2P as a virulent parasite that commandeers more than half of their bandwidth.
Rather than deliver clips from a central server, P2P slices the video into tiny chunks that live on millions of users' PCs. Whenever one user requests the clip, it's reconstructed and delivered from points all over the world. Last year, to prevent such usage from slowing its customers' connections, Comcast decided to throttle back traffic from a P2P program called BitTorrent, sparking howls of protest and government inquiries.
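The chunking idea behind P2P delivery can be sketched in miniature. Here the "peers" are just entries in a dictionary; a real client such as BitTorrent would also verify chunk hashes and download from many peers in parallel:

```python
def split_into_chunks(data, chunk_size):
    """Slice a video file into small pieces that can live on many peers."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def reassemble(chunks_from_peers, num_chunks):
    """Rebuild the file from pieces fetched from whichever peers hold them."""
    return b"".join(chunks_from_peers[i] for i in range(num_chunks))

video = b"all 63 games of the tournament"
pieces = split_into_chunks(video, 4)
# Imagine each piece arriving from a different peer around the world:
gathered = dict(enumerate(pieces))
restored = reassemble(gathered, len(pieces))
# restored is byte-for-byte identical to the original file
```

Because every downloader also becomes a source for the pieces it holds, the more popular a clip gets, the more capacity the swarm has to serve it.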
Now, some major forces are rethinking this antagonistic view of P2P. Verizon helped form a group called P4P to collaborate with its former adversaries, and the initial results look promising. Verizon thinks it can boost download speeds by about 60% if its customers' computers, rather than searching all over the globe for bits of a video file, are directed to limit the search to one another. "This is just an example of what can happen when the carriers work with us," says Gilles Bianrosa, CEO of P2P company Vuze.
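The P4P idea, steering a client toward peers inside its own provider's network before reaching across the globe, can be sketched like this. The peer records and ISP labels are invented for illustration, not drawn from any real P4P deployment:

```python
def select_peers(candidates, local_isp, max_peers):
    """Prefer peers on the requester's own ISP; fill leftover slots remotely.

    Keeping traffic inside one network avoids costly inter-carrier hops,
    which is where P4P's claimed speedups come from.
    """
    local = [p for p in candidates if p["isp"] == local_isp]
    remote = [p for p in candidates if p["isp"] != local_isp]
    return (local + remote)[:max_peers]

peers = [
    {"addr": "peer-a", "isp": "Verizon"},
    {"addr": "peer-b", "isp": "Comcast"},
    {"addr": "peer-c", "isp": "Verizon"},
]
chosen = select_peers(peers, local_isp="Verizon", max_peers=2)
# Both slots go to the same-ISP peers, peer-a and peer-c
```

A real deployment gets the network map from the carrier itself, which is exactly the cooperation between ISPs and P2P companies that P4P was formed to broker.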
No doubt, consumers will suffer plenty of frustrations as Net video goes mainstream. "Every time we reach a new frontier—a million simultaneous users, or 5 million, or 15 million—you're going to see glitches," says Mike Gordon, chief strategy officer of Limelight. But, he adds, "They'll get fixed, and we'll move on. We've got to."