CrowdED and CREX: Towards Easy Crowdsourcing Quality Control Evaluation
Title | CrowdED and CREX: Towards Easy Crowdsourcing Quality Control Evaluation |
Type de publication | Conference Paper |
Year of Publication | 2019 |
Authors | Awwad T, Bennani N, Rehn-Sonigo V, Brunie L, Kosch H |
Editor | Welzer T, Eder J, Podgorelec V, Latific AK |
Conference Name | Advances in Databases and Information Systems, ADBIS 2019 |
Publisher | Springer International Publishing AG |
Publisher Address | Gewerbestrasse 11, Cham, CH-6330, Switzerland |
ISBN Number | 978-3-030-28730-6; 978-3-030-28729-0 |
Keywords | Crowdsourcing, Dataset, Extendable campaign, Generic platform, Quality control |
Abstract | Crowdsourcing is a time- and cost-efficient web-based technique for labeling large datasets, such as those used in machine learning. Controlling output quality in crowdsourcing is an active research domain that has yielded a fair number of methods and approaches. However, due to the quantitative and qualitative limitations of existing evaluation datasets, the comparison and evaluation of these methods has remained very limited. In this paper, we present CrowdED (Crowdsourcing Evaluation Dataset), a rich dataset for evaluating a wide range of quality control methods, alongside CREX (CReate Enrich eXtend), a framework that facilitates the creation of such datasets and guarantees their future-proofing and re-usability through customizable extension and enrichment. |
DOI | 10.1007/978-3-030-28730-6_18 |