This paper deals with the problem of evaluating submissions to crowdsourcing websites, on which data is increasing rapidly in both volume and complexity. Usually, expert committees are appointed to rate submissions, select winners, and adjust monetary rewards. With an increasing number of submissions, this process becomes more complex, time-consuming, and hence expensive. In this paper we suggest applying text mining methodology, foremost similarity measures and clustering algorithms, to evaluate the quality of submissions to crowdsourcing contests semi-automatically. We evaluate our approach by comparing text-mining-based measurements of more than 40,000 submissions with the real-world decisions made by expert committees, using precision and recall together with the F1 score.
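As a minimal illustration of the evaluation metrics named in the abstract, the following Python sketch computes precision, recall, and the F1 score by comparing binary expert decisions against predicted quality labels. The function name and example data are hypothetical and not taken from the paper; this only shows the standard metric definitions, not the authors' actual pipeline.

def precision_recall_f1(expert_labels, predicted_labels):
    """Compute precision, recall, and F1 for binary quality decisions.

    expert_labels / predicted_labels: sequences of 0/1, where 1 means a
    submission was judged (or predicted) to be a winning/high-quality entry.
    """
    pairs = list(zip(expert_labels, predicted_labels))
    tp = sum(1 for e, p in pairs if e == 1 and p == 1)  # correctly flagged
    fp = sum(1 for e, p in pairs if e == 0 and p == 1)  # falsely flagged
    fn = sum(1 for e, p in pairs if e == 1 and p == 0)  # missed

    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical toy data: expert committee decisions vs. text-mining predictions.
experts = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]
print(precision_recall_f1(experts, predicted))  # -> (0.75, 0.75, 0.75)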
Language
English
HSG Classification
contribution to scientific community
HSG Profile Area
SoM - Business Innovation
Refereed
Yes
Book title
2013 46th Hawaii International Conference on System Sciences
Publisher
IEEE Computer Society
Publisher place
Los Alamitos, CA
Volume
1st ed.
Start page
3109
End page
3118
Pages
10
Event Title
46th Hawaii International Conference on System Sciences (HICSS)