Title: Software crowdsourcing reliability: an empirical study on developers behavior
Abstract: Crowdsourcing has become an emergent paradigm for software production in recent decades. Its open-call format attracts the participation of hundreds of thousands of developers. To ensure the success of software crowdsourcing, we must accurately measure and monitor the reliability of participating crowd workers, which, surprisingly, has rarely been done. To that end, this paper examines the dependability of crowd workers in selecting tasks for software crowdsourcing. Our empirical analysis of worker behavior investigates: (1) workers' behavior when registering for and carrying out announced tasks; (2) the relationship between rewards and performance; (3) the effects of development type among different groups; and (4) the evolution of workers' behavior according to the skills they have acquired. The study's findings include: (1) on average, the most reliable crowdsourcing group responds to a task call within 10% of its allotted time and completes the assigned work in less than 5% of that time; (2) crowd workers tend to focus on tasks within specific ranges of rewards and types of challenges, based on their skill levels; and (3) crowd skills spread evenly across the entire range of groups. In summary, our results can guide future research into crowdsourcing service design and can inform crowdsourcing strategies based on time, reward, development type, and other aspects of crowdsourcing.
Publication Year: 2016
Publication Date: 2016-11-01
Language: en
Type: article
Indexed In: Crossref
Access and Citation
Cited By Count: 10