Designing reliable crowdsourcing systems: efficient algorithms and fundamental limits

UIUC

 

Abstract
Crowdsourcing systems, such as Amazon Mechanical Turk, provide platforms where large-scale projects are broken into small tasks that are electronically distributed to many on-demand contributors. Because these low-paid workers can be unreliable, schemes are needed to increase confidence in the answers, typically by assigning each task to multiple workers and aggregating their responses. I will present a rigorous treatment of this problem and provide both an optimal task assignment scheme (using a random graph) and an optimal inference algorithm (based on low-rank matrix approximation and belief propagation) for that task assignment. This approach significantly outperforms previous approaches and is, in fact, asymptotically order-optimal: the budget it requires to achieve a target reliability matches, up to a constant factor, that of an oracle estimator. Further, we show that even adaptive schemes, which assign tasks based on all the responses collected thus far, require a budget that scales in the same manner to achieve the same target reliability. Hence, perhaps surprisingly, the gain from adaptive task allocation is only marginal.
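To make the idea concrete, here is a minimal toy sketch (not the talk's actual algorithm or parameters, which are my assumptions) of the low-rank flavor of the approach: tasks are assigned via a random bipartite graph, and worker reliabilities are estimated by power iteration on the response matrix, so that reliable workers' votes are weighted more heavily than in plain majority voting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (numbers are illustrative assumptions):
# n binary tasks, m workers, each task assigned to l random workers.
n_tasks, n_workers, l = 200, 200, 15
truth = rng.choice([-1, 1], size=n_tasks)
# Each worker answers correctly with her own (unknown) probability p_j.
p = rng.uniform(0.3, 0.9, size=n_workers)

# Random bipartite assignment; A[i, j] is worker j's +/-1 answer
# to task i, and 0 where no response was collected.
A = np.zeros((n_tasks, n_workers))
for i in range(n_tasks):
    for j in rng.choice(n_workers, size=l, replace=False):
        A[i, j] = truth[i] if rng.random() < p[j] else -truth[i]

def majority_vote(A):
    # Baseline: each task's answer is the plain majority of its responses.
    return np.sign(A.sum(axis=1))

def iterative_estimate(A, iters=20):
    # Power iteration toward the top singular vector of A: y holds
    # worker-reliability scores, x holds reliability-weighted task votes.
    y = rng.normal(1.0, 1.0, size=A.shape[1])
    for _ in range(iters):
        x = A @ y                # tasks: reliability-weighted votes
        y = A.T @ x              # workers: agreement with task estimates
        y /= np.linalg.norm(y)   # normalize to avoid overflow
    est = np.sign(A @ y)
    # The singular vector's global sign is ambiguous; align it with
    # the majority vote so the comparison below is meaningful.
    if (est * majority_vote(A)).sum() < 0:
        est = -est
    return est

mv_err = np.mean(majority_vote(A) != truth)
it_err = np.mean(iterative_estimate(A) != truth)
print(f"majority-vote error: {mv_err:.3f}, iterative error: {it_err:.3f}")
```

On this toy instance the reliability-weighted estimate typically makes fewer errors than plain majority voting, because spammy workers are down-weighted rather than counted equally.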


Bio:
Sewoong Oh is an Assistant Professor of Industrial and Enterprise Systems Engineering at UIUC. His research interest is in understanding how to extract meaningful information from societal data, such as aggregating opinions on social computation platforms like Mechanical Turk, making recommendations from individuals' ratings, and learning rankings from pairwise comparisons.