
Beware the Advertising Pretest


Some of the most successful campaigns flunked their pretests, and research still can't reliably predict how ads will be received in the real world

It may seem odd that anybody would have a problem with pretesting an ad or marketing campaign. It's hard to argue with the potentially money-saving (and mistake-preventing) insights research can provide. And in theory, pretesting makes total sense. The problem is that the science of advertising pretesting just isn't there yet.

A few years back, Advertising Age ran a story about the uncertainties of ad testing. It cited Volkswagen's (VOWG) popular 1997 "Da, Da, Da" Golf commercial, which some in Volkswagen management didn't want produced. The article went on to say that a handful of Volkswagen commercials were evaluated using GM's (GM) custom-designed pretesting system. According to the story, the successful VW commercials flunked under GM's process.

The Artificial World of Testing

Why can't ad testing systems always predict real-world success? There are many reasons, but I believe they essentially boil down to this: It is impossible to replicate in a research situation how somebody will respond to an ad on a Sunday afternoon while sitting in their easy chair munching on nachos and watching the game.

When people are invited to participate in market research, whether it's an online survey, a focus group, or even an in-home study, the circumstances will change the subjects' behavior. They know they're being watched, and they may even believe their job is to be critical.

Respondents on the Hot Seat

Think about how a focus group works—people are invited in, fed a meal, and paid an incentive to offer insights and opinions that the sponsoring marketer can use. The pressure is on to contribute something of value. It's rare for someone to admit simply liking an ad, let alone that it might influence them to buy something. Instead, participants tend to understate how much advertising affects them and to be overly critical of the ads themselves.

But the desire to contribute isn't the only problem. Even if people in focus groups wanted to give an honest opinion, they may not be able to. People just aren't able to articulate or even understand all the ways advertising affects them.

Marsha Lindsay, a graduate lecturer at the University of Wisconsin and a member of the executive committee of the American Association of Advertising Agencies, explains the problem this way: "Copy testing and other research based on explicit learning cannot accurately predict ads' success because consumers can't tell us 'the truth' about how ads affect them. That learning often lies buried in their subconscious."

The Shock of the New

Stanford psychologist Robert Zajonc suggests that the more people see the same thing, the more they like it—but that people often don't initially like rare or unfamiliar things. Commenting on Zajonc's research, Bruce Tait of Fallon Brand Consulting says, "If brands are to succeed, they need to be based on differentiated, unfamiliar brand strategies. Unfortunately, these are the exact same ideas that people initially dislike. That's why quantitative testing of alternative positioning ideas will likely systematically kill the more original ideas, and people will prefer the ones that are closest to what they already know. The marketer using this type of test will unwittingly select the strategy that is less differentiated and eventually fail in the marketplace."

By contrast, consider what the people behind some of the marketplace's most successful—and beloved—advertising have to say. Scott Bedbury, the former worldwide advertising director at Nike (NKE), says, "We never pretested anything we did at Nike, none of the ads. [Dan] Wieden [the founder of Wieden & Kennedy] and I had an agreement that as long as our hearts beat, we would never pretest a word of copy. It makes you dull. It makes you predictable. It makes you safe."

Indeed, being creative is by definition being different, and being different is risky. Goodby, Silverstein & Partners is one of America's most accomplished advertising agencies. Jon Steel, the agency's first director of account planning, summarizes his experience with pretesting this way: "I recently put together a reel of advertising…including 'Got milk?', Polaroid, Isuzu Rodeo, Norwegian Cruise Line, and others…All were extremely effective in building the client's business. Yet all could easily have died in creative development research had consumer comments been listened to literally, creatives not been allowed to express their differing opinions, and the client in each case not had the courage to say, 'I hear what they are saying, but I will not change my mind about running this advertising as a result.'"

Research as a Compass

Because of the limitations of the science of advertising research, the only truly reliable form of testing is the real world—in a test-market situation, for example, or by using tracking studies. Even these have their limits, and the results must be interpreted carefully. But at least these methods are based on what happens in real time, in the real world—not on what respondents think might happen.

At my firm, we love research. The insights it can provide are invaluable, and they often lead us to creative breakthroughs. But we use research as a compass, not a map. We use it to explore, not to decide. We would love nothing more than to discover a way to prove ahead of time how an ad is going to be received in the marketplace. But the science just isn't there yet.

Research can tell you a lot of things, but it can't predict the future. As Tait puts it, "Statistical reliability is not the same thing as the truth."

