Evaluating design solutions using crowds

Jin Bao, Yasuaki Sakamoto, Jeffrey V. Nickerson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

27 Scopus citations

Abstract

Crowds can be used to generate and evaluate design solutions. To increase a crowdsourcing system's effectiveness, we propose and compare two evaluation methods, one using five-point Likert scale rating and the other prediction voting. Our results indicate that although the two evaluation methods correlate, they have different goals: whereas prediction voting focuses evaluators on identifying the very best solutions, rating focuses evaluators on the entire range of solutions. Thus, prediction voting is appropriate when there are many poor-quality solutions that need to be filtered out, and rating is suited to cases where all ideas are reasonable and distinctions need to be made across all solutions. The crowd prefers participating in prediction voting. The results have pragmatic implications, suggesting that evaluation methods should be assigned in relation to the distribution of quality present at each stage of crowdsourcing.
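
The contrast the abstract draws can be made concrete with a small sketch. The code below is not from the paper; the solution names, ratings, votes, and the `min_votes` threshold are all hypothetical. It only illustrates the asymmetry described above: mean Likert ratings order every solution across the full quality range, while prediction voting surfaces the expected best solutions and filters out the rest.

```python
from collections import Counter
from statistics import mean

# Hypothetical data: each evaluator rates every solution on a 1-5 Likert scale...
likert_ratings = {
    "A": [5, 4, 5, 4],
    "B": [3, 3, 2, 3],
    "C": [1, 2, 1, 2],
}

# ...and separately casts one prediction vote for the solution they
# expect to be judged best.
prediction_votes = ["A", "A", "A", "B"]

def rank_by_rating(ratings):
    """Rating distinguishes across the entire range: every solution gets
    a mean score, so even middling ideas end up ordered."""
    return sorted(ratings, key=lambda s: mean(ratings[s]), reverse=True)

def filter_by_votes(votes, min_votes=1):
    """Prediction voting focuses on the very best: solutions that fall
    below the vote threshold are filtered out entirely."""
    counts = Counter(votes)
    return [s for s, n in counts.most_common() if n >= min_votes]

print(rank_by_rating(likert_ratings))     # ['A', 'B', 'C'] -- a full ordering
print(filter_by_votes(prediction_votes))  # ['A', 'B'] -- 'C' never surfaces
```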

Original language: English
Title of host publication: 17th Americas Conference on Information Systems 2011, AMCIS 2011
Pages: 3923-3931
Number of pages: 9
State: Published - 2011
Event: 17th Americas Conference on Information Systems 2011, AMCIS 2011 - Detroit, MI, United States
Duration: 4 Aug 2011 – 8 Aug 2011

Publication series

Name: 17th Americas Conference on Information Systems 2011, AMCIS 2011
Volume: 5

Conference

Conference: 17th Americas Conference on Information Systems 2011, AMCIS 2011
Country/Territory: United States
City: Detroit, MI
Period: 4/08/11 – 8/08/11

Keywords

  • Creativity
  • Crowdsourcing
  • Evaluation
  • Human-based genetic algorithms
  • Mechanical Turk
