January 28, 2007


EXCERPT: from Words that Work by Frank Luntz

How "Words That Work" Are Created

"If you think about it, talking to a polling company is an odd way to behave. Strangers ask you to give them time and personal information for nothing so that they can profit from it."
--Nick Cohen, Sunday Observer (London)

"If I need five people in a mall to be paid forty dollars to tell me how to do my job, I shouldn't have my job."
--Roger Ailes, President, Fox News Channel

This story may get me barred from the United States Senate, but it was how I established my credibility with the toughest, most skeptical organization in America. Back in 1998, I was asked to create and then present new language on environmental issues to a meeting of the entire Republican Senate Conference. Helping members of the House is easy: They are open-minded, creative, and focused. The Senate, however, is a different animal entirely. They're generally older, uncompromising, and don't take kindly to others telling them either what to think or what to say. They also demand proof that your conclusions and recommendations are based on fact. I knew that to convince these senators that I had created the right language, I had to do something so novel, surprising, and provocative (rule five of successful communication) that even the most determined cynic would accept the results.

And so I arrived there armed with a video presentation that I knew could cost me dearly with four specific senators but would earn me the confidence I needed with everyone else. On that tape were speeches that I had written for these four senators. More accurately, I had written just one speech, and I had four senators read exactly the same text, word for word. I then had the speech "dial-tested" using a Madison Avenue technique described later in this chapter. The presentation video was a compilation of the results -- each senator's second-by-second score.

On a big screen in front of the room, the senators watched as computer-generated lines created by a focus group of swing voters rose and fell based on how those thirty individuals felt about each word and phrase. But instead of showing each Senate speech individually, I had the tape edited to show how each paragraph fared, paragraph by paragraph, line by line, senator by senator. Sure enough, it didn't matter whether the speech was well delivered or mangled. It didn't matter whether the senator had a rich southern accent or flat northwestern inflection. The senator's gender didn't even matter. Regardless of the senator or the delivery, the good language scored well and the bad language scored poorly. And so the more than forty senators in the room were mildly amused to see that their four colleagues had unknowingly delivered the exact same speech, but they were impressed and convinced that good language does well no matter how good or bad the speaker. The methodology for creating words that work passed their stringent credibility test, and I have been invited back more than two dozen times.
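The book does not spell out the arithmetic behind those second-by-second lines, but the aggregation it describes can be sketched simply: each of the thirty respondents turns a dial, the dial values are averaged each second, and the per-second averages are then grouped by the paragraph being spoken at that moment. The function name and data shapes below are my own illustrative assumptions, not Luntz's actual tooling.

```python
# Hypothetical sketch of dial-test aggregation (not from the book).
# Assumes each of 30 respondents reports a 0-100 dial value once per
# second; we average across respondents per second, then average the
# seconds that fall within each paragraph of the speech.

def average_dial_scores(readings, paragraph_spans):
    """readings: per-second lists of dial values (one list per second).
    paragraph_spans: (start_sec, end_sec) tuples, one per paragraph.
    Returns the mean dial score for each paragraph."""
    per_second = [sum(vals) / len(vals) for vals in readings]
    return [
        sum(per_second[start:end]) / (end - start)
        for start, end in paragraph_spans
    ]

# Example: a 6-second clip split into two 3-second paragraphs.
readings = [[50] * 30, [60] * 30, [70] * 30,
            [40] * 30, [40] * 30, [40] * 30]
print(average_dial_scores(readings, [(0, 3), (3, 6)]))  # [60.0, 40.0]
```

Comparing the same paragraph's score across the four senators, as the edited tape did, is then just a matter of running this aggregation once per speaker and lining up the results paragraph by paragraph.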

Here's where I need to address the profession -- the methodology -- and give you a peek behind the one-way glass and word-laboratory curtain. My editors wanted this section to be very brief: to them, how words that work are created is less important than the words themselves. But I insisted that the process of word creation is and should be just as important as the outcome. So if you are just trying to pick up the language lingo, you may want to skip this section. But if you are in the business of language, or you enjoy the "making of" DVD "extras" as much as the movie itself, read on.

Let's start with the practitioners.

It's hard to tell who is in greater demand today: the Madison Avenue branding experts who are brought in to teach political parties how to define themselves, or the political consultants brought into corporate boardrooms to teach businesses how to communicate more effectively. The tools and techniques invented on Madison Avenue firmly took hold in Washington during the Reagan years -- and they continue to drive our politics today. Similarly, more and more companies are turning to political professionals for help achieving the speed, agility, and linguistic accuracy that were once the unique province of electoral campaigns.

Pollsters and the polling they do are unnecessarily shrouded in a cloud of mystery, much of it of their own making, in the mistaken assumption that the less people understand about the pollster's craft, the more the pollster can charge. The two best-known pollsters of the modern political era are Pat Caddell, who did the numbers for the Carter White House from 1977 through 1981, and Dick Morris, who served as a general political advisor to President Clinton for much of Clinton's career. Both men took on almost mythical proportions in the eyes of their clients and the media for their uncanny ability to translate staid numbers into vibrant political and linguistic strategy. And both men broke the first professional rule of thumb (incidentally, the term "rule of thumb" comes from an archaic rule that a husband could not beat his wife with anything thicker than his thumb): that the pollster is not the maker of public opinion but its translator.

Nevertheless, they forever changed the world of public opinion gathering. Caddell was the first pollster to test and turn language into a powerful political weapon, applying the art of "wordsmithing" to the science of opinion gathering. Morris, through the actual polling services of Mark Penn and Doug Schoen, was the first outside political advisor to essentially drive White House communication strategy. Between them, they applied the techniques of ongoing public opinion sampling and the application of language as an instrument of policy to create the permanent presidential campaign.

Today, polling is no longer a black art. There is a poll on every possible topic, and some Americans follow polls the way Wall Street follows the market. I am constantly amazed that the Q&A periods following my speeches to corporate and association audiences across the country are consistently peppered with questions about some specific polling result in the news that day and its veracity -- usually asked by someone who holds a contrary point of view.

The truth is, Americans are drowning in polling numbers. National news organizations poll on a monthly or even weekly basis, and the results are given more weight, space in print, and time on air than what the politicians are actually saying. Most recently there have been times when polls about the war in Iraq drowned out the actual events of the day. Unfortunately, while the media have all the numbers they can possibly crunch, most surveys and their accompanying analyses are lacking in meaningful insight.

I don't seek to undermine the profession that built my home and pays my mortgage, but telephone surveys have serious limitations that most readers would acknowledge -- if they were in fact polled. The first is the increasing difficulty of getting a truly random sample of the population. The increase in cell-phone usage, particularly among those under age thirty, has made it extremely difficult to sample younger Americans (because some cell-phone calling plans charge individuals for incoming calls, it is not acceptable to poll cell phones). Similarly, the rise of "do not call" lists, the increase in unlisted phone numbers, and a general unwillingness of some Americans to answer questions from a stranger are all challenges that pollsters have to overcome every day.

Another problem with telephone polls, and Internet surveys as well, is that Americans don't want to respond yes or no to alternatives that are either unacceptable or require clarification. In the context of today's political environment, there are too many shades of gray, too many "Yes, but what I really think is . . ." attitudes, too many voter priorities that cannot be ranked and explained over the phone. You can test a few words or slogans, but after about fifteen minutes, the respondent will stop responding. Internet surveys have an even shorter patience threshold before respondent fatigue sets in.

Even more problematic is the ordering of questions. Opinion pollsters know full well that where they ask a question within the survey exerts tremendous influence on what answers they receive. If a pollster has just spent fifteen minutes with you on the phone, grilling you about the frustrations of dealing with your HMO, and then closes the survey by asking you to rate the importance of health care reform against a host of other issues, you're far more likely to pick health care as highly important than you would be if it had been the first question in the survey. Likewise, laying out a new corporate pension policy to your employees will generate a strikingly different reception if you've first explained to them that the current policy is bankrupting the company and will lead to layoffs.

And even if the ordering of questions is correct, too many polls report what voters or consumers think without explaining how they feel -- and why. They measure thoughts and opinions, but they don't provide a deeper understanding of the mind -- and the heart. Feelings and emotions are what generate words that work.

That's why I am a committed disciple of focus groups in general and the "Instant Response Dial Session" in particular. A focus group is often nothing more than a formal discussion for ninety minutes or two hours with eight to twelve people who have similar backgrounds, behaviors, opinions, or some other commonality. Madison Avenue has been commissioning focus groups for more than half a century, and virtually every aspect of every major new product launch will involve a dozen or more of these sessions. Political researchers were slower to bring the value of face-to-face discussions to politics, as such sessions are somewhat less profitable and more labor-intensive than traditional telephone surveys.

Focus groups have been much maligned by the media as a rogue science, designed to learn how to obscure and/or manipulate. True, they do have their limitations, most important among them the scientific inability to project the results of a discussion with two or three dozen people to a population of thousands or millions. They are reflective of the people in the session, not the total population.
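The limitation above is a matter of simple arithmetic, which the book does not work through. Even under the generous assumption that a focus group were a simple random sample (it is not, which compounds the problem), the standard 95 percent margin of error for an estimated proportion, z * sqrt(p(1-p)/n), is enormous at focus-group sizes. The sketch below is my own illustration, not the author's methodology.

```python
# Illustrative arithmetic (not from the book): why results from a few
# dozen participants cannot be projected to a large population, even
# under the false assumption of a simple random sample.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (12, 30, 1000):
    print(n, round(margin_of_error(n) * 100, 1))
# A 30-person session carries roughly a +/- 17.9-point margin of error,
# versus about +/- 3.1 points for a 1,000-person telephone survey.
```

In other words, a focus-group finding that 60 percent of participants liked a phrase is statistically compatible with anywhere from a large minority to an overwhelming majority of the broader population -- hence the book's point that such sessions reveal *why* people react, not *how many* will.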

But a well-run focus group is the most honest of all research techniques because it involves the most candid commentary and all of the uncensored intensity that real people can muster. As in telephone polling, focus groups begin by gauging respondent awareness and superficial opinions and attitudes. But unlike telephone polling, the superficiality is then stripped away, revealing deeper motivations, associations, and underlying needs. The interaction between a professional moderator and the participants encourages more honesty and less pandering, while measuring the intensity of opinion as well as individual motivation. That's where you'll find the words that work.

A well-run focus group is a laboratory for social interaction and word creation -- yet it is one of the most obscure components of audience research. The composition of the focus group must be arrived at scientifically and statistically, and most Americans will never be invited to participate simply because most Americans don't qualify.

Posted by Orrin Judd at January 28, 2007 2:58 PM