At the 2013 AAPOR conference, researchers from Westat and the IRS reported on the effect of offering multiple survey modes on response rates. The research used the IRS Individual Taxpayer Burden survey, which measures the time and money taxpayers spend on recordkeeping, tax planning, professional tax help, and completing returns.

In 2010, the researchers ran an experiment comparing a paper-only survey with a sequential multi-mode survey, in which respondents were first invited to take a web survey and were then sent a paper survey if they failed to complete it online. (In a separate test, not reported here, some participants in each condition were offered incentives and the rest were not.) The paper-only survey had the higher response rate, at 42.3% (n=12,509) versus 39.4% (n=4,210) for the sequential multi-mode survey. Even more disappointing for hopes of shifting the research to web surveys, only 14.9% of those invited took the survey via web, with 24.4% completing it via the subsequently mailed paper survey.
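As a sanity check on that 2.9-point gap, the published figures are enough for a standard two-proportion z-test. The Python sketch below is my own illustration, not part of the Westat/IRS analysis (the presentation did not report a significance test), and the pooled-variance formula is an assumption:

```python
import math

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided z-test of H0: the two response rates are equal (pooled variance)."""
    x1, x2 = p1 * n1, p2 * n2            # implied respondent counts
    p_pool = (x1 + x2) / (n1 + n2)       # pooled response rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # 2 * (1 - Phi(|z|))
    return z, p_value

# 2010 results: paper-only (42.3%, n=12,509) vs. sequential multi-mode (39.4%, n=4,210)
z, p = two_proportion_z_test(0.423, 12_509, 0.394, 4_210)
print(f"z = {z:.2f}, p = {p:.4f}")   # z ≈ 3.30, p ≈ .001
```

By this back-of-the-envelope test, the paper-only advantage is too large to be sampling noise, though substantively it is still only about three percentage points.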

These findings were similar to those of Messer and Dillman, who tested a web-plus-mail sequential design that provided “a respectable response rate with a high proportion” responding by web, but with the mail-only response rate still outperforming that of the web design.

The alternative to sequential multi-mode surveys is the concurrent-mode survey, in which respondents choose the mode they want to use to respond. Many studies have found that offering respondents a choice of modes decreases the response rate:

  • 2001 – Griffin, Fisher, and Morgan reported that multiple modes offered in a mailing depressed the response rate.
  • 2009 – Gentry reported that offering participants a choice between a paper diary and an online diary decreased the response rate by 4.2 percentage points.
  • 2012 – Medway and Fulton reported that offering a web alternative to a mail survey lowered the response rate.

This is the paradox of mode choice: “Allowing respondents to choose how they complete the survey makes it more likely they won’t complete the survey at all.” Medway and Fulton offered three possible explanations for this behavior:

  1. Some respondents attempt to complete the web survey but are unable to do so for technical reasons.
  2. Some respondents decide to use the web alternative but never actually log on.
  3. “Simply being asked to make a choice entails an additional cognitive burden that functions to deter response.”

Despite these findings, and given the goal of decreasing costs by increasing reliance on the web, the 2011 Individual Taxpayer Burden survey increased the number of contact points from 4 to 6, sending the paper survey 3 times, each time with web instructions. The survey was designed not to present the web mode as a choice. Rather than saying something like “You have the choice of completing this survey by mail or online,” it positioned the web option as an alternative:

[Image: IRS Taxpayer Burden booklet]

A call-out on the front cover said, “Want to take the survey on the web? See back cover for instructions.” The back page began with “If you would prefer to complete the survey on the web, you may do so by following the instructions below. Web responses are processed more quickly and will help ensure that you don’t receive follow-up contacts.”

The 2011 survey was not an experiment, so its performance has to be contrasted with the 2010 survey:

  • 43.4% responded to the 2010 sequential-mode protocol, with incentives
  • 47.6% responded to the 2010 paper-only protocol, with incentives
  • 49.6% responded to the 2011 concurrent-mode protocol (all 2011 participants were offered incentives)

Unfortunately, only 2.7% of participants completed the survey by web.
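To put that 2.7% in perspective, a quick calculation shows how the web’s share of completed surveys collapsed between designs. The sketch assumes, as the text implies, that both the web percentage and the overall response rate use all invited participants as their base:

```python
# Web completes as a share of all completes, per protocol.
# Figures from the article; the common "all invited" base is an assumption.
protocols = {
    "2010 sequential multi-mode": (14.9, 39.4),  # (web %, overall response %)
    "2011 concurrent mode":       (2.7, 49.6),
}
for name, (web_pct, overall_pct) in protocols.items():
    print(f"{name}: {web_pct / overall_pct:.0%} of completes arrived by web")
# 2010 sequential multi-mode: 38% of completes arrived by web
# 2011 concurrent mode: 5% of completes arrived by web
```

So while the 2011 framing protected the overall response rate, the web went from carrying roughly two-fifths of completes to carrying about one in twenty.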

So the paradox of this paradox-of-choice test is that, by not presenting the mixed-mode survey as a choice, the IRS avoided depressing the response rate; yet few respondents took it up on the offered choice.

Even for the IRS, mixed-mode surveys are a taxing problem.

References

The AAPOR 2013 presentation “Multi-mode survey administration: Does offering multiple modes at once depress response rates?” summarized research conducted by Jocelyn Newsome, Kerry Levin, and Pat Dean Brick of Westat and Pat Langetieg, Melissa Vigil, and Michael Sebastiani of the IRS Office of Research. Their cited references follow.

  1. De Leeuw, E. (2005). To mix or not to mix data collection modes in surveys. Journal of Official Statistics, 21, 233-255.
  2. Fricker, R. and Schonlau, M. (2002). Advantages and disadvantages of Internet research surveys: Evidence from the literature. Field Methods, 14, 347-367.
  3. Gentry, R. (2009). Offering respondents a choice of survey mode: Use patterns of an Internet response option in a mail survey. Paper presented at the DC AAPOR Web Survey Methods Workshop, September 10, 2009.
  4. Griffin, D., Fisher, D., and Morgan, M. (2001). Testing an Internet response option for the American Community Survey. Paper presented at the American Association for Public Opinion Research conference, New Orleans, May 2001.
  5. Medway, R. and Fulton, J. (2012). When more gets you less: A meta-analysis of the effect of concurrent web options on mail survey response rates. Public Opinion Quarterly, 76, 733-746.
  6. Messer, B. and Dillman, D. (2011). Surveying the general public over the Internet using address-based sampling and mail contact procedures. Public Opinion Quarterly, 75, 429-457.
  7. Olson, K., Smyth, J., and Wood, H. (2012). Does giving people their preferred survey mode actually increase survey participation rates? An experimental examination. Public Opinion Quarterly, 76, 611-635.

Author Notes:

Jeffrey Henning

Jeffrey Henning, IPC, is a professionally certified researcher who has personally conducted over 1,400 survey research projects. Jeffrey is a member of the Insights Association and the American Association for Public Opinion Research. In 2012, he was the inaugural winner of the MRA’s Impact award, which “recognizes an industry professional, team or organization that has demonstrated tremendous vision, leadership, and innovation, within the past year, that has led to advances in the marketing research profession.” In 2022, the Insights Association named him an IPC Laureate. Before founding Researchscape in 2012, Jeffrey co-founded Perseus Development Corporation in 1993, which introduced the first web-survey software, and Vovici in 2006, which pioneered the enterprise-feedback management category. A 35-year veteran of the research industry, he began his career as an industry analyst for an Inc. 500 research firm.