The most common problem with the draft questionnaires that are sent to me is the use of leading questions. A leading question suggests the answer the survey author is looking for and often unintentionally reflects the author’s bias. As a result, the answers to such questions overstate actual support for the item being researched.
Leading questions can be subtle or obvious. An obvious leading question, taken from an event survey, is “How likely is it that you will attend the 2014 conference at our new, low entry prices?”
A subtler example, used in the text Marketing Research by Burns and Bush, is “Should people be allowed to protect themselves from harm by using Mace as self-defense?” The survey sponsor’s viewpoint is clear – yes, yes, they should.
How can you fix these questions? Before I answer that, let’s look at some other egregious examples, which will help highlight one method of fixing them.
Leading Questions about Female Presidents
In a speech to the MRA conference in 2008 touching on Hillary Clinton’s candidacy for the Democratic nomination, Kathleen A. Frankovic, then with CBS News, provided some great examples of leading questions used when asking about the American electorate’s willingness to vote for a female President:
- “Would you vote for a woman for President if she were qualified in every other aspect?” The implicit bias made explicit here was that being a woman was itself an aspect that disqualified her from the presidency! (Only 33% of respondents to this 1937 poll said they would vote for a woman.)
- The next example was a statement respondents were asked whether they agreed with: “There won’t be a woman President of the United States for a long time and that’s probably just as well.” This one deserves no comment. (Sadly, 67% of respondents to this 1970 Virginia Slims poll of women agreed.)
Ironically, a quick review of CBS News polls from 2008 reveals leading questions of their own on different aspects of the issue:
- “I am glad to see a woman as a serious contender for president.” (88% agreed, but with wording like that, how could you not agree?)
- “Is a woman president in your lifetime likely?” (79% of respondents under 45 said it was, as did 44% of those 65 and up.)
A better question that CBS News asked was about the number of women holding political office. It offered three choices: “Would like more”, “Would like fewer”, and “About right”. And that’s how leading questions can often be rewritten: make the biased choice that is implicit in the question wording explicit in a list of choices that includes alternatives.
Another approach is to randomly cycle through a list. For instance, Gallup periodically asks, “If your party nominated a generally well-qualified person who happened to be [ITEMS A-H READ IN ORDER], would you vote for that person?” and then cycles through “Black”, “A woman”, “Catholic”, “Hispanic”, “Jewish”, “Mormon”, “Gay or lesbian”, “Muslim”, and “An atheist”. (Gallup reports 95% would vote for a woman, according to its most recent results.)
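If you script your own surveys, that kind of rotation is straightforward to implement. Here is a minimal sketch in Python (my own illustration, not Gallup’s or any survey platform’s code; the build_questionnaire function and the seeding scheme are assumptions for the example) that shuffles the attributes independently for each respondent, so that no single item always benefits, or suffers, from being asked first.

```python
import random

# Candidate attributes to rotate, taken from the Gallup example above.
ATTRIBUTES = [
    "Black", "a woman", "Catholic", "Hispanic", "Jewish",
    "Mormon", "gay or lesbian", "Muslim", "an atheist",
]

# Question stem with a placeholder for the rotated attribute.
QUESTION = ("If your party nominated a generally well-qualified person who "
            "happened to be {attribute}, would you vote for that person?")

def build_questionnaire(respondent_seed=None):
    """Return the question series with the attribute order shuffled per respondent."""
    rng = random.Random(respondent_seed)  # per-respondent RNG keeps each order reproducible
    order = ATTRIBUTES[:]                 # copy so the master list stays untouched
    rng.shuffle(order)
    return [QUESTION.format(attribute=attr) for attr in order]

if __name__ == "__main__":
    # Two respondents see the same items, but in different orders.
    for respondent_id in (1, 2):
        print(f"Respondent {respondent_id}:")
        for question in build_questionnaire(respondent_seed=respondent_id):
            print("  -", question)
```

Randomizing the order per respondent averages out any order effect across the whole sample, which is the point of rotation; a fixed reading order would systematically favor whichever item comes first.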
More Examples of Leading Questions
At that same MRA conference, Robin Pearl of Estée Lauder pointed out a subtle leading question that, up until then, I had used all the time: “What do you like about….?” This, of course, implies that there is something to like about whatever is being researched. Based on her recommendation, I now use “What, if anything, do you like about…?” I now, appropriately, get the response “Nothing” more frequently than I did in the past.
Other examples:
- “Are you aware that a homeowner’s policy does not cover flooding and that you must contact an agent to buy a separate flood insurance policy to protect against flood damage?” Thanks, I’m aware now! (No surprise 81% answered “Yes”.) Better to provide a list of features and non-features of homeowner’s policies and see which ones respondents believed were included.
- “Are you more inclined to invest in the stock market now that interest rates on savings accounts and CDs are so low?” A more accurate question would have asked, “How has your inclination to invest in the stock market changed compared to 12 months ago?”
- “How much do you think you can save by buying online?” A neutral question would have asked whether prices were the same as, higher than, or lower than offline alternatives, and then asked by how much. (Ironically, literally as I was writing this post, a client sent me this question [lightly edited to remove their product category]: “How much more would you pay for the convenience of shopping online?”)
As these examples show, leading questions can be insidious.
Final Recommendations for Revising Leading Questions
Before you can remove biases like those above, you need someone with a little more distance from the subject matter to review the questionnaire – preferably someone from outside your organization, and therefore outside its typical worldview.
Once you’ve identified the leading questions, rewrite them by removing the assumption expressed in the question wording.
Here’s how to fix the first two examples we looked at:
- “How likely is it that you will attend the 2014 conference at our new, low entry prices?” Better to drop “at our new, low entry prices” from the question when asking it. Then test attendance likelihood at different price points.
- “Should people be allowed to protect themselves from harm by using Mace as self-defense?” A more neutral question would ask “What should people be allowed to use for self-defense?” and would list a range of possible items in addition to Chemical Mace.
Whenever you read survey news stories, make sure to look out for possible leading questions. It’s a great way to train your eye so that you can write better questionnaires yourself.
You can lead a respondent to the answer, but that won’t help you make good decisions from the data.