In a cabana in Progreso, Mexico, overlooking the blue waters of the Gulf, Canadian Chuck Dueck cracks open his laptop and logs into the comment forums of several news websites. Over at cbc.ca, home to the Canadian Broadcasting Corp., an article on child obesity has drawn this gem, “It is VERY simple. People who are FAT eat too much. There were no fat Jews in Auschwitz—they did not have much food. Stop eating so much!” At npr.org, one comment is directed specifically at Dueck. “GO F- -K YOUR SELF A- -HOLE, You are making me hate this site!!! F-G!”
One by one, Dueck, a professional online moderator, deletes these comments, scolds the people behind them (either on the forum or over e-mail), and, if things really get out of hand—say, in the case of repeat offenders—bans their accounts. Over the course of each day he chips away at the cussing and swearing, the spammers, haters, and trolls, temporarily restoring civility to his corner of the Internet.
Since the first messages were posted on bulletin boards some three decades back, comments and free discussion between anonymous users have been a central part of the Internet’s appeal. Sites such as Gawker and the Huffington Post built their empires on page clicks driven by endless streams of commenters and flame wars. But what’s good for Gawker isn’t always great for established brands, and as companies have embraced the Web and eagerly interacted with their customers, they’ve often been overwhelmed by the response. A lethal combination of anonymity, opinion, and the safety of typing from a remote location all but guarantees that comment forums get out of hand, falling prey to the Hobbesian tirades of the Web’s most nasty, brutish, and vocal denizens—hence, the increasing need for moderators such as Dueck to intervene and sanitize sites’ comment boards.
Dueck works for ICUC Moderation, the brainchild of Winnipeg businessman Keith Bilous, which started out in 2002 as Captain Interactive, broadcasting text messages onto nightclub screens (after vetting the content). Today, ICUC employs over 200 moderators globally and was acquired in June by London’s Aegis Group (though Bilous, like all his employees, still works from home). The company claims $10 million in revenue last year, cleaning up the comments on the websites, Twitter feeds, and Facebook pages of blue-chip brands such as Chevron (CVX), Starbucks (SBUX), and the Boston Globe. “Some Fridays you feel like you need to spend two hours in the shower because it’s so disgusting,” says Bilous.
“We see the dark underbelly of the world,” says Tamara Littleton, the CEO of London-based eModeration, a 160-person community management firm with $7 million in revenue whose clients include MTV, the Economist, and ESPN (DIS). The firm has doubled in size each year since it began in 2002 (also as a text-to-screen nightclub gimmick), and charges clients anywhere from $4,000 to $50,000 a month for moderation. “It used to be a lot about keeping things clean, safe, and legal for brands. All they wanted was people not to say horrible things,” says Littleton. “Now it’s about engagement…. Now you want to manage Facebook pages and Twitter accounts.” She notes that while the social networks don’t allow for anonymity in comments, they’ve increased her company’s workload tremendously, as consumers demand instant responses from brands online. Littleton cites an incident last year when Nestlé PR people tried to stifle criticism from Greenpeace on Nestlé’s Facebook page, which was not professionally moderated. The move unleashed a torrent of comments and turned into a PR disaster. Had a team like eModeration’s been in place, she argues, the situation might have been defused before it blew up.
Although escalation training and sensitivity are part of the job, comment moderation requires unusual equanimity. “It’s art, not science,” says Caterina Fake, co-founder of Flickr and a former community manager for various early online comment forums. Moderators are largely middle-aged and well educated. Most work remotely, on flexible schedules. “Ours tend to be women over the age of 35 working from home, sometimes in addition to other jobs,” says Peter Friedman, who has been hiring comment moderators since he set up Apple’s (AAPL) internal social network in 1984. Today he’s CEO of LiveWorld, an $8 million online community management provider with 200 to 400 active moderators working at any time for clients like Pfizer (PFE) and Bank of America (BAC).
Jessamyn West, a Vermont librarian, initially volunteered to moderate comments on MetaFilter and is now one of two full-time paid moderators on the site. “I can work with a community to make a model of good behavior,” she says. Most of the time, that means jumping into overheated conversations and reminding users that “we don’t call each other a- -holes.”
Moderators (or “mods,” as some call them) can earn anywhere from $40,000 to $80,000 annually, but need to be prepared for daily exposure to humanity at its vilest. Extreme racism and bigotry, images of pedophilia, and even personal threats are all too common. Littleton, who has even had her home address and phone number posted by disgruntled commenters, makes sure new recruits undergo extensive background checks. “You need good common sense, and you need a really thick skin,” she says.
The strain can take its toll. Although nasty comments make up less than 10% of what appears online, according to Littleton, the bad apples are what moderators are paid to deal with. A significant number of new hires at ICUC last less than two weeks. To cope, moderators work on sites in short shifts, flipping between forums prone to maliciousness (news stories about Israel, say) and something more joyful (LEGO fan pages).
Sometimes comments escalate to the point where law enforcement is called in. Friedman recently contacted authorities when threats against President Obama appeared on a website discussing Home Improvement reruns. Fake and other moderators once helped stop a case of human trafficking. After a MetaFilter user commented that two Russian girls he’d met online were going to interview for jobs at a bar, another commenter noted that the bar was a known brothel, and the girls were alerted. And both LiveWorld and eModeration have successfully intervened with police to stop attempted suicides. In one incident, police raced to an address that moderators had traced from the commenter’s IP address, breaking down the door as the individual was tying a noose.
Many companies, like The New York Times, still moderate their own websites, but the costs of employing full-time moderators can add up. Often, a brand will simply hand over control of its comments, as was the case with npr.org, which hired ICUC last year to manage all its commenting. Others have taken a more lighthearted approach. In 2008, Deadspin, Gawker’s sports blog, embraced its reputation for vicious commenters and let the site’s moderator, Rob Iracane, write a short-lived column about the current state of the site’s comments.
Moderation outsourcing services will continue to grow, predicts Jeremiah Owyang, an analyst specializing in online customer service with the Altimeter Group. But, he says, “expect emerging markets like India and the Philippines to offer services to brands direct, with retrained call center staff.”
“There’s a huge surge of companies coming into this market offshore” charging clients $5 an hour, vs. the $30 to $40 an hour eModeration charges, says Littleton. But, she warns, moderating a conversation requires more than just a list of swear words in a native language and a spam filter. “We want our moderators to help our members navigate and have empathy for the community,” says Tina Sharkey, president and chief executive of babycenter.com, one of the largest international parenting sites. BabyCenter’s 10 in-house moderators are all mothers, recruited from the site’s chat rooms. “It’s a much more authentic experience when someone’s coming in to be part of the conversation, not just an observer with their nose against the glass.”