Viacom's suit seeks a better way to remove copyright-violating YouTube uploads. Why don't content creators and Web sites both foot the bill?
To hear Google (GOOG) tell it, Viacom (VIA) wants to unravel the social Web. If Viacom had its druthers, Web sites that rely on user-generated content would be held responsible when users upload material that violates copyrights, Google argues in a public response to Viacom's $1 billion lawsuit accusing Google of copyright infringement.
The implication, Google argues, is that services like video-sharing site YouTube would have trouble getting off the ground. "Viacom's lawsuit challenges the protections of the Digital Millennium Copyright Act that Congress enacted a decade ago to encourage the development of services like YouTube," Google wrote in its May 26 response. "Congress recognized that such services could not and would not exist if they faced liability for copyright infringement based on materials users uploaded to their services."
Google is only partly right. It's true that doing away with certain DMCA protections—such as those that shield Internet companies from liability for distributed content—would indeed hamper many sites. After all, it's nearly impossible for companies to ensure that all the videos, photos, comments, and other content uploaded to sites don't violate copyrights. Even if such omniscient content screening were possible, it would undoubtedly be cost-prohibitive for all but the largest players.
But Viacom isn't looking to dismantle the DMCA, though its suit does point to a major flaw in the law that Web sites and media companies both must address: what to do when infringing content is taken down but then immediately put back online. "Even after it [YouTube] receives a notice from a copyright owner, in many instances the very same infringing video remains on YouTube because it was uploaded by at least one other user, or appears on YouTube again within hours of its removal," Viacom says in its complaint.
The DMCA needs an upgrade. Enacted in 1998, the law needs to reflect advances that make it simple to continually post and repost offending material. Media companies like Viacom shouldn't have to file notice after notice to the same site concerning versions of the same piece of content.
When the DMCA became law, user participation was largely confined to AOL chat rooms and peer-to-peer services. As a result, it was feasible for media companies to police their content by searching for unlicensed uploads and filing takedown notices.
Much has changed. Now users are uploading videos, photos, and audio files to video-sharing sites, social networks, blogs, and even sites owned by major media companies. Viacom's MTV.com, for example, lets users upload photos and videos. The number of people uploading content, and the speed with which they do it, have made filing takedown notices ineffective for media companies that produce large amounts of popular content.
In some cases, by the time a site responds to a takedown request for a popular piece of content, another slightly altered version of that clip or photo has already been uploaded. Under the current law, the media company often has little choice but to file yet another takedown notice for that slightly altered version and hope that someone doesn't simply upload the same material in the interim.
The technology exists to make additional protections for copyright owners possible. Companies such as Audible Magic, Nielsen Media Research, and Vobile all have digital fingerprinting or watermarking services that enable content creators to register content in a central database that can then be scanned by Web sites for matches to user-uploaded content. In cases where media companies have already filed takedown notices, that technology can be used to match against the removed material and keep the same infringing content, or very similar versions of it, off the site.
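In outline, such a system is simple: once a takedown notice has been filed, the removed material's fingerprint goes into a registry, and every new upload is checked against it. The sketch below uses a plain cryptographic hash as a stand-in fingerprint; real services such as Audible Magic or Vobile compute perceptual fingerprints that survive re-encoding and small edits, so treat this only as an illustration of the workflow, not of their technology.

```python
# Minimal sketch of a takedown-aware upload filter. fingerprint() is a
# stand-in: a real system would use a perceptual hash robust to re-encoding.
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Stand-in fingerprint; real services use perceptual hashing."""
    return hashlib.sha256(media_bytes).hexdigest()

class TakedownRegistry:
    """Fingerprints of content already subject to a takedown notice."""
    def __init__(self):
        self._blocked = set()

    def register_takedown(self, media_bytes: bytes) -> None:
        # One notice from the copyright owner registers the work once.
        self._blocked.add(fingerprint(media_bytes))

    def allows_upload(self, media_bytes: bytes) -> bool:
        # Every new upload is screened against the registry.
        return fingerprint(media_bytes) not in self._blocked

registry = TakedownRegistry()
clip = b"...infringing video bytes..."
registry.register_takedown(clip)

print(registry.allows_upload(clip))           # False: re-upload is blocked
print(registry.allows_upload(b"home movie"))  # True: unrelated upload passes
```

The point of the design is that the copyright owner files one notice per work rather than one per re-upload, which is precisely the gap in the current notice-and-takedown regime.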
Sophisticated Filters Needed
"The technology is being deployed and it is fairly good," says Mark Kirstein, president of research firm MultiMedia Intelligence. The U.S. market for such technology will grow to more than $500 million by 2012, he estimates. "It is not going to prevent everything, but it is vastly superior to the alternative, which is largely the content companies monitoring content almost manually and issuing takedown notices."
Google recognizes the potential of filtering technology (BusinessWeek.com, 10/16/07) and is working on its filters for YouTube to solve the re-uploading problem.
Of course, no filtering technology is perfect. There is a risk that filters will screen out noninfringing material, such as parodies of popular content, or fail to recognize a slightly altered version of an infringing Saturday Night Live clip, for example. "If you say that content is automatically taken down when the audio track matches [registered copyrighted content], that is not good enough since the video could be completely different," says Corynne McSherry, a staff attorney at the Electronic Frontier Foundation, a nonprofit digital rights group. "It may be someday that filters become sophisticated enough, but right now I don't think they are there yet."
With companies such as Google behind them, however, filters are bound to get better. Even the EFF, which supports the DMCA as written, doesn't object when filters take down material that matches content for which a company has already filed a notice and no user has contested the removal. Nor does it object to filters that flag possibly infringing content for human review.
The key issue in Viacom vs. Google/YouTube is not whether the DMCA should be thrown out altogether, or whether it is possible to keep users from uploading the same content again and again. Really, the fight is over who should pay for the policing. Google says the DMCA as written doesn't require it to pay, and Google is probably right. Viacom says it shouldn't have to shoulder the cost of YouTube's success by paying a third party to screen all of the site's user-uploaded videos. That argument makes some sense as well.
Why not make both content creators and Internet companies pay? Both sides have an interest in policing content. Content creators want to protect their copyrights and make sure they are adequately compensated by the sites with which they do have licensing agreements. And as user-generated content sites mature, YouTube and others have an interest in ensuring that their sites are well-lit places where marketers can safely place ads without worrying about offending a partner or unwittingly supporting pirated material.
Content creators could pay to watermark and register content, and still file initial takedown notices to sites. That way, Web sites would not remove content that creators willingly uploaded, say, to hype a new show.
For their part, Web companies could pay a fee to the same third-party company to scan user-uploaded material against a library of registered content with prior takedown notices. The fee could vary based on the amount of user-uploaded material that the site needs to scan, and thus should take into account the differences between a YouTube and a new startup.
It behooves content creators and Web companies to start exploring compromise. Judges often come up with verdicts that force companies to meet somewhere around the middle anyway.