A common challenge when writing closed-ended questions is enumerating all of the most common choices. Getting this list wrong can bias the results.

Alternatively, a crowd-shaped choice list adapts to reflect consumer responses. It can start out like a conventional choose-one question, presenting a list of options chosen by the questionnaire author and an open-ended “other” response, or it can start out with just an entry field. Unlike in a traditional online survey, where the “other – please specify” responses are simply stored in the database, every choice typed in by a respondent is added to the list. Respondents choose from selections entered by earlier respondents.
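The core mechanism can be sketched in a few lines. This is a minimal illustration, not any particular product's implementation; the class and method names are my own invention:

```python
from collections import Counter

class CrowdShapedList:
    """A choice list that grows as respondents type in 'other' answers.

    Illustrative sketch only: the article does not specify an API.
    """

    def __init__(self, seed_choices=None):
        # How often each choice has been selected or entered so far.
        self.counts = Counter({c: 0 for c in (seed_choices or [])})

    def record(self, response):
        # Normalize lightly so "Coffee" and "coffee " count as one choice.
        choice = response.strip().lower()
        # Unlike a traditional survey, a new "other" response is added
        # to the list itself, not merely stored in the database.
        self.counts[choice] += 1

    def choices(self):
        # Present choices ordered by how often earlier respondents chose them.
        return [c for c, _ in self.counts.most_common()]

poll = CrowdShapedList(["coffee", "tea"])
poll.record("coffee")
poll.record("kombucha")  # a brand-new write-in joins the list
```

After these two responses, later respondents would see "kombucha" as a selectable choice alongside the author's original options.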

Over time, only the most frequently selected choices are presented. To protect against the earliest-entered choices always being selected, the displayed choices can be chosen probabilistically based on Bayesian scores: each choice is given a seed value, which is decremented each time the choice is presented but not selected and incremented each time it is selected. The “other” text box can use autocomplete to suggest the undisplayed choices that earlier respondents have entered, based on the initial letters the respondent types.
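One way to implement this scoring and probabilistic display is sketched below. The seed value, the floor on scores, and all names are assumptions for illustration; the article does not prescribe a specific scoring rule:

```python
import random

SEED = 5.0  # assumed starting score for every new choice

class AdaptiveChoices:
    """Illustrative sketch of score-weighted choice display."""

    def __init__(self):
        self.scores = {}  # choice -> current score

    def add(self, choice):
        self.scores.setdefault(choice, SEED)

    def display(self, k=4):
        # Fill k display slots by weighted sampling without replacement,
        # in proportion to score, so early entries cannot lock in forever.
        pool = dict(self.scores)
        shown = []
        while pool and len(shown) < k:
            names, weights = zip(*pool.items())
            pick = random.choices(names, weights=weights)[0]
            shown.append(pick)
            del pool[pick]
        return shown

    def record(self, shown, selected):
        for c in shown:
            if c == selected:
                self.scores[c] += 1  # incremented when selected
            else:
                # Decremented when presented but passed over; floored at 1
                # so weighted sampling always has positive weights.
                self.scores[c] = max(self.scores[c] - 1, 1)

    def autocomplete(self, prefix):
        # Suggest undisplayed choices matching the typed initial letters.
        return sorted(c for c in self.scores if c.startswith(prefix.lower()))
```

With this rule, a choice that keeps being shown but never chosen drifts toward the floor and is displayed less and less often.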

Do crowd-sourced choice lists remove the potential for bias found in standard, scripted choice lists? No: the answers entered by early respondents may dramatically shape subsequent selections, and a choice set may draw subtle distinctions between some choices but broad distinctions between others. Additionally, from a reporting standpoint, it’s important to track and show the percent of times a choice was selected out of the times it was presented, as well as the percent of times it was entered as an “other”.
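Those two reporting metrics differ because a write-in was, by definition, never on screen when it was entered. A small sketch of the calculation (the event fields are assumptions, not a documented schema):

```python
def choice_report(events):
    """events: dicts with 'shown' (list of displayed choices),
    'selected' (the response), and 'was_other' (True if typed
    into the other box rather than clicked)."""
    presented, selected, entered_other = {}, {}, {}
    total = len(events)
    for e in events:
        for c in e["shown"]:
            presented[c] = presented.get(c, 0) + 1
        c = e["selected"]
        if e["was_other"]:
            entered_other[c] = entered_other.get(c, 0) + 1
        else:
            selected[c] = selected.get(c, 0) + 1
    report = {}
    for c in set(presented) | set(entered_other):
        shown_n = presented.get(c, 0)
        report[c] = {
            # Selection rate conditional on being presented; undefined
            # (None) for choices that were never displayed.
            "pct_selected_when_shown": selected.get(c, 0) / shown_n if shown_n else None,
            # Share of all responses typed in as this "other" answer.
            "pct_entered_as_other": entered_other.get(c, 0) / total,
        }
    return report

events = [
    {"shown": ["coffee", "tea"], "selected": "coffee", "was_other": False},
    {"shown": ["coffee", "tea"], "selected": "tea", "was_other": False},
    {"shown": ["coffee", "tea"], "selected": "kombucha", "was_other": True},
]
rep = choice_report(events)
```

Reporting both rates side by side shows whether a choice underperforms because respondents reject it or simply because the adaptive list rarely displays it.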

Crowd-sourced choice lists are best for teasing out the most common language used by respondents. They are well suited to pilot questionnaires, giving the author the data to create a scripted closed-ended question for the next round of the survey, one whose results can then be extrapolated with confidence.

Crowd-sourced choice lists are one of many techniques for crowd-shaping questionnaires, using the participation of earlier respondents to reshape the experience of later respondents. For a review of the other techniques and how they can improve your research, please download our white paper, “Crowd-Shaped Surveys”.

Author Notes:

Jeffrey Henning

Jeffrey Henning, IPC is a professionally certified researcher and has personally conducted over 1,400 survey research projects. Jeffrey is a member of the Insights Association and the American Association for Public Opinion Research. In 2012, he was the inaugural winner of the MRA’s Impact award, which “recognizes an industry professional, team or organization that has demonstrated tremendous vision, leadership, and innovation, within the past year, that has led to advances in the marketing research profession.” In 2022, the Insights Association named him an IPC Laureate. Before founding Researchscape in 2012, Jeffrey co-founded Perseus Development Corporation in 1993, which introduced the first web-survey software, and Vovici in 2006, which pioneered the enterprise-feedback management category. A 35-year veteran of the research industry, he began his career as an industry analyst for an Inc. 500 research firm.