With the emergence of crowdsourcing systems, human-labeled data can be collected more easily, quickly, and efficiently. However, due to the anonymous nature of crowdsourcing platforms, the quality of crowd workers is difficult to guarantee, and human variance and noisy annotators may lead to incorrect results. A conventional approach is to consult multiple workers by collecting repeated labels, and truth inference techniques play a vital role in handling the human noise in the data collected this way. Yet a higher-quality training dataset usually requires more labels and therefore a larger budget. In addition, worker designation is not supported on current crowdsourcing platforms, which makes it more challenging to save money by distinguishing good workers from noisy ones. In this thesis, we propose a framework that leverages worker qualification groups and assigns tasks based on the fitness between the qualification groups and the tasks. We anticipate that our framework will serve as a practical strategy for solving the budget allocation problem on crowdsourcing platforms.
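The abstract describes the framework only at a high level. As a rough illustration of what fitness-based matching between qualification groups and tasks could look like, the following Python sketch greedily assigns each task to the affordable group with the highest estimated fitness under a simple per-group budget. All class names, the fitness formula, and the parameters are hypothetical placeholders, not the method actually proposed in the thesis.

```python
# Minimal illustrative sketch (not the thesis's algorithm): greedily assign
# each task to the qualification group with the highest estimated fitness,
# as long as that group still has budget for another label.
from dataclasses import dataclass


@dataclass
class Group:
    name: str
    accuracy: float        # estimated label quality of this qualification group
    cost_per_label: float  # price charged per label
    budget: float          # money allocated to this group


@dataclass
class Task:
    task_id: str
    difficulty: float      # in [0, 1]; harder tasks benefit more from better groups


def fitness(task: Task, group: Group) -> float:
    """Toy fitness score: expected label quality per unit cost (hypothetical)."""
    expected_quality = group.accuracy * (1.0 - 0.5 * task.difficulty)
    return expected_quality / group.cost_per_label


def assign(tasks, groups):
    """Assign each task to the best-fitting group that can still afford a label."""
    assignment = {}
    for task in tasks:
        candidates = [g for g in groups if g.budget >= g.cost_per_label]
        if not candidates:
            break  # all group budgets exhausted
        best = max(candidates, key=lambda g: fitness(task, g))
        best.budget -= best.cost_per_label
        assignment[task.task_id] = best.name
    return assignment


if __name__ == "__main__":
    groups = [Group("expert", 0.95, 0.10, 1.0), Group("novice", 0.70, 0.02, 1.0)]
    tasks = [Task(f"t{i}", difficulty=i / 10) for i in range(10)]
    print(assign(tasks, groups))
```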
Date of Award | 2018 Aug 13
Original language | English
Supervisor | Kun-Ta Chuang (Supervisor)
Strategies of Sequential Budget Allocation without Worker Designation in Crowdsourcing
唯, 李. (Author). 2018 Aug 13
Student thesis: Master's Thesis