CrowdED and CREX: Towards Easy Crowdsourcing Quality Control Evaluation

Title: CrowdED and CREX: Towards Easy Crowdsourcing Quality Control Evaluation
Publication Type: Conference Paper
Year of Publication: 2019
Authors: Awwad T, Bennani N, Rehn-Sonigo V, Brunie L, Kosch H
Editors: Welzer T, Eder J, Podgorelec V, Latific AK
Conference Name: Advances in Databases and Information Systems, ADBIS 2019
Publisher: Springer International Publishing AG
Publisher Address: Gewerbestrasse 11, Cham, CH-6330, Switzerland
ISBN: 978-3-030-28730-6; 978-3-030-28729-0
Keywords: Crowdsourcing, Dataset, Extendable campaign, Generic platform, Quality Control
Abstract

Crowdsourcing is a time- and cost-efficient web-based technique for labeling large datasets such as those used in machine learning. Controlling output quality in crowdsourcing is an active research domain that has yielded a fair number of methods and approaches; however, the quantitative and qualitative limitations of existing evaluation datasets have severely restricted the comparison and evaluation of these methods. In this paper, we present CrowdED (Crowdsourcing Evaluation Dataset), a rich dataset for evaluating a wide range of quality control methods, alongside CREX (CReate Enrich eXtend), a framework that facilitates the creation of such datasets and guarantees their future-proofing and reusability through customizable extension and enrichment.

DOI: 10.1007/978-3-030-28730-6_18