Task search in a human computation market
Open Access
- 25 July 2010
- proceedings article
- Published by Association for Computing Machinery (ACM)
Abstract
In order to understand how a labor market for human computation functions, it is important to know how workers search for tasks. This paper uses two complementary methods to gain insight into how workers search for tasks on Mechanical Turk. First, we perform a high-frequency scrape of 36 pages of search results and analyze it by examining the rate at which tasks disappear under the key sort orders Mechanical Turk offers workers. Second, we present the results of a survey in which we paid workers for self-reported information about how they search for tasks. Our main findings are that, in aggregate, workers sort by which tasks are most recently posted and which have the largest number of instances available. Furthermore, we find that workers look mostly at the first page of the most recently posted tasks and at the first two pages of the tasks with the most available instances, but in both cases the position within a result page is unimportant to workers. We observe that at least some employers try to manipulate the position of their task in the search results to exploit the tendency to search for recently posted tasks. At the individual level, we observed workers searching by almost all of the possible sort categories and looking more than 10 pages deep. For a task we posted to Mechanical Turk, we confirmed that a favorable position in the search results does matter: our task with favorable positioning was completed 30 times faster, and for less money, than when its position was unfavorable.
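The scrape-based analysis above hinges on one measurement: comparing successive snapshots of the search results under a given sort order and computing what fraction of tasks vanished between scrapes. The paper does not publish its analysis code; the following is a minimal sketch of that computation, assuming each snapshot is simply a list of HIT identifiers per result page (the function and variable names are hypothetical, not from the paper).

```python
def disappearance_rate(prev_snapshot, curr_snapshot):
    """Fraction of tasks seen in the previous scrape that no longer
    appear in the current scrape (a proxy for completion/removal)."""
    prev, curr = set(prev_snapshot), set(curr_snapshot)
    if not prev:
        return 0.0
    return len(prev - curr) / len(prev)


def rates_by_page(prev_pages, curr_pages):
    """Per-page disappearance rates for one sort order.

    prev_pages / curr_pages: lists of lists of HIT ids, one inner
    list per result page, in page order.
    """
    return [disappearance_rate(p, c) for p, c in zip(prev_pages, curr_pages)]


# Example: between two scrapes, tasks "a" and "b" left page 1.
pages_t0 = [["a", "b", "c"], ["d", "e"]]
pages_t1 = [["c", "f"], ["d", "e"]]
print(rates_by_page(pages_t0, pages_t1))  # [2/3 for page 1, 0.0 for page 2]
```

Comparing these per-page rates across sort orders (newest first, most instances available, etc.) is what lets one infer which sort orders workers actually use.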
Funding Information
- National Science Foundation (333403)