In Crowdsourcing New Designs, Providing the Right Information is Critical

Research by Michigan Ross Professor Damian Beil finds that the specifics of online contests can impact the quality of entries received.

Key takeaways:

  • Limiting the amount of conceptual background you provide in a creative brief can increase participation in crowdsourced design contests.

  • Providing more specific guidelines for execution yields the highest quality designs.

If you have a design challenge that lends itself to a crowdsourced solution, the way you frame the issue in the “creative brief” can affect the results you get, according to new research.

In recent years, businesses have often turned to crowdsourcing — asking for help from a broad audience online, often in a contest format — for creative tasks such as designing logos or websites. A new research paper by Michigan Ross Professor Damian Beil and colleagues finds that the design requirements of such contests, and the information provided to participants, can impact the number and quality of entries.

The researchers — Beil; Zhaohui (Zoey) Jiang, a PhD student at Michigan Ross who will join the faculty of the Tepper School of Business at Carnegie Mellon; and Yan Huang of Tepper — developed a theoretical model and used it to analyze data from actual contests conducted on a popular crowdsourcing platform.

They found that different types of information provided by the contest sponsor can have significantly different impacts on the behavior of the participants. Specifically, they found:

  • It is important to distinguish between different types of information in the creative brief — that is, conceptual objectives (what the challenge is) versus execution guidelines (how the challenge will be met).
  • Presenting too many conceptual objectives ratchets up the “cost” of entering and reduces the number of participants.
  • Adding execution guidelines results in more effort per participant and a higher quality of entries.
  • To get the highest quality solutions to crowdsourced design problems, organizers should provide detailed execution guidelines and only a moderate number of conceptual objectives.

“These findings could improve current marketplace practices, as the platforms currently often encourage seekers to pack as much information as possible into their problem specification describing the design they want,” the researchers wrote.

Damian Beil is a professor and chair of technology and operations and the Ford Motor Company Co-Director of the Joel D. Tauber Institute for Global Operations at the University of Michigan Ross School of Business.

READ THE FULL PAPER

Media Contact: Bridget Vis, Public Relations Specialist, visb@umich.edu
 
