While crowdsourcing has been used in market research for years now, it is often compartmentalized as an input method – an open way of gathering ideas from customers, prospects, and employees. In fact, crowdsourcing can – and often should – be used throughout the research process, as case studies from AllOurIdeas, InSites Consulting, AOL, and IdeaScale make clear.

Idea Collection

The classic use case for crowdsourcing is creating a public or private community specifically for generating ideas. The favorite example, six years on, is still My Starbucks Idea, which has now generated nearly 200,000 ideas. The question frequently raised by the success of such sites is: how do we actually analyze the results?

Idea Ranking

Many idea generation platforms skew idea ratings by emphasizing “hot” ideas. For instance, the “Popular Ideas” tab on the Salesforce Ideas platform positively reinforces ideas that are already popular, making it harder for worthwhile new ideas to break through, especially once they have scrolled off the Recent Ideas tab.

All Our Ideas offers a unique take on this: load it with your ideas, and visitors to your site are shown two randomly selected ideas and vote for the one they like better. Every idea gets seen and evaluated, and visitors can’t stuff the ballot box. While contributors can submit new ideas as well, All Our Ideas is primarily a voting platform. For instance, a 2010 All Our Ideas study on creating “a greener, greater New York City” identified a bike-share program as one of the top ideas of the more than 200 generated; Citi Bike launched just last year.
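The pairwise-voting mechanic described above can be sketched in a few lines. This is a simplified illustration, not All Our Ideas’ actual implementation (which uses more sophisticated statistical estimation than raw win rates), and the idea names are hypothetical examples:

```python
import random
from collections import defaultdict

ideas = ["Bike-share program", "More green roofs",
         "Expand recycling", "Plant street trees"]

wins = defaultdict(int)         # pairwise contests won per idea
appearances = defaultdict(int)  # times each idea was shown

def next_pair(rng=random):
    """Pick two distinct ideas at random, so every idea gets seen."""
    return rng.sample(ideas, 2)

def record_vote(winner, loser):
    """Record one visitor's choice between the pair shown."""
    wins[winner] += 1
    appearances[winner] += 1
    appearances[loser] += 1

def ranking():
    """Rank ideas by the share of pairwise contests they won."""
    return sorted(
        ideas,
        key=lambda i: wins[i] / appearances[i] if appearances[i] else 0.0,
        reverse=True,
    )
```

Because pairs are sampled at random, new ideas compete on equal footing with established ones, which is exactly what sidesteps the “hot ideas” bias of popularity-sorted lists.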


Idea Analysis

At the Green Book IIEX conference in Philadelphia last year, Niels Schillewaert of InSites Consulting presented a case study for Air France-KLM on improving the transfer-flight experience, which often provokes negative emotions in travelers. After soliciting ideas from one community, InSites recruited a second community of 46 frequent fliers to analyze those ideas and add their own perspectives. They expanded the analysis from 26 insights to 68 and revised 6 marketing assumptions, an analysis that researchers could not have accomplished alone.

At the 2013 ESOMAR 3D Digital Dimensions conference, Joseph Blechman noted the value of crowdsourcing the analysis of verbatim responses, specifically for categorization and sentiment analysis. He gave an example of using multiple participants to evaluate each comment in order to improve reliability.
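The multiple-rater approach can be sketched as a simple majority vote with an agreement score. This is an illustrative assumption, not Blechman’s actual method; the function name, three-rater setup, and sample comments are all hypothetical:

```python
from collections import Counter

def aggregate_sentiment(ratings):
    """Majority vote across raters; also return the level of agreement
    so low-consensus comments can be flagged for expert review."""
    counts = Counter(ratings)
    label, votes = counts.most_common(1)[0]
    agreement = votes / len(ratings)
    return label, agreement

# Hypothetical verbatims, each coded by three crowd raters
ratings = {
    "Great service!": ["positive", "positive", "positive"],
    "It was fine, I guess": ["neutral", "positive", "neutral"],
}
results = {comment: aggregate_sentiment(r) for comment, r in ratings.items()}
```

Redundant coding trades a little extra cost per comment for a reliability measure: unanimous comments can be accepted automatically, while split verdicts are routed back to researchers.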

Research magazine recapped an IdeaScale community underway for the U.S. Navy. The Navy writes, “During the next four weeks, the CNO wants to hear from you: from the deck seaman to the goat locker, from the ensign to the Commanding Officer… submit ideas as well as vote and comment on other ideas. Collectively, the best ideas and feedback will ‘bubble to the top’.” To break out of the “hot ideas” bias, IdeaScale offers participants many ways to navigate ideas, including a search engine, tags, and recommendations of similar ideas to review. All of this is intended to help the Navy with “Reducing Administrative Distractions” and putting “Warfighting First”.

While idea generation is certainly the place to start when implementing crowdsourcing, make sure to expand your crowdsourcing to provide rich quantitative and qualitative assessment of ideas as well.

Author Notes:

Jeffrey Henning

Jeffrey Henning, IPC is a professionally certified researcher and has personally conducted over 1,400 survey research projects. Jeffrey is a member of the Insights Association and the American Association of Public Opinion Researchers. In 2012, he was the inaugural winner of the MRA’s Impact award, which “recognizes an industry professional, team or organization that has demonstrated tremendous vision, leadership, and innovation, within the past year, that has led to advances in the marketing research profession.” In 2022, the Insights Association named him an IPC Laureate. Before founding Researchscape in 2012, Jeffrey co-founded Perseus Development Corporation in 1993, which introduced the first web-survey software, and Vovici in 2006, which pioneered the enterprise-feedback management category. A 35-year veteran of the research industry, he began his career as an industry analyst for an Inc. 500 research firm.