Crowd Agreement Measure
A Crowd Agreement Measure is an evaluation reliability measure that quantifies the level of consensus among crowd workers while accounting for worker quality variation and task difficulty.
- AKA: Crowd Consensus Measure, Worker Agreement Measure, Crowd Reliability Measure, Non-Expert Agreement Index.
- Context:
- It can typically incorporate Worker Quality Weights in agreement calculation (see the sketch after this list).
- It can typically handle Sparse Annotation Matrices from partial overlap.
- It can often detect Spammer Workers through agreement patterns.
- It can often adjust for Task Difficulty affecting expected agreement.
- It can support Quality Control Decisions in crowdsourcing workflows.
- It can enable Worker Performance Tracking for payment systems.
- It can facilitate Annotation Aggregation weighting reliable workers.
- It can integrate with Active Learning selecting ambiguous instances.
- It can range from being a Binary Crowd Agreement to being a Multi-Class Crowd Agreement, depending on its label type.
- It can range from being a Raw Crowd Agreement to being an Adjusted Crowd Agreement, depending on its correction method.
- It can range from being a Pairwise Crowd Agreement to being a Group Crowd Agreement, depending on its worker scope.
- It can range from being a Task-Level Agreement to being an Instance-Level Agreement, depending on its granularity.
- ...
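The sketch below illustrates how several of these context properties can combine in practice: it computes reputation-weighted pairwise agreement over a sparse annotation matrix (stored as a dict of dicts, so partial overlap is handled naturally) and flags low-agreement workers as possible spammers. The function name `weighted_agreement`, the multiplicative pair weighting, and the 0.5 spam threshold are illustrative assumptions rather than a standard definition.

```python
from collections import defaultdict
from itertools import combinations

def weighted_agreement(annotations, worker_weights, spam_threshold=0.5):
    """Reputation-weighted pairwise agreement over a sparse annotation matrix.

    annotations: {item_id: {worker_id: label}}  -- partial overlap allowed
    worker_weights: {worker_id: weight in (0, 1]} -- e.g. from gold checks
    Returns (overall weighted agreement, per-worker agreement rates,
    set of suspected spammer worker ids).
    """
    pair_weight_sum = 0.0
    pair_agree_sum = 0.0
    worker_pairs = defaultdict(lambda: [0.0, 0.0])  # worker -> [agreements, comparisons]

    for labels in annotations.values():
        # Only workers who labeled the same item are compared, so sparsity is fine.
        for (w1, l1), (w2, l2) in combinations(labels.items(), 2):
            w = worker_weights.get(w1, 1.0) * worker_weights.get(w2, 1.0)
            agree = 1.0 if l1 == l2 else 0.0
            pair_weight_sum += w
            pair_agree_sum += w * agree
            for worker in (w1, w2):
                worker_pairs[worker][0] += agree
                worker_pairs[worker][1] += 1.0

    overall = pair_agree_sum / pair_weight_sum if pair_weight_sum else 0.0
    # Workers whose raw agreement with peers falls below the threshold are
    # flagged as possible spammers (a crude agreement-pattern heuristic).
    per_worker = {w: a / n for w, (a, n) in worker_pairs.items() if n}
    suspected_spammers = {w for w, r in per_worker.items() if r < spam_threshold}
    return overall, per_worker, suspected_spammers
```

With full overlap and uniform weights, the overall score reduces to plain observed pairwise agreement; down-weighting unreliable workers simply shrinks the influence of their (dis)agreements on the overall score.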
- Examples:
- Statistical Agreement Measures, such as:
- Fleiss Kappa for Crowds handling multiple raters (see the kappa sketch below).
- Weighted Majority Agreement using worker reputation (see the voting sketch below).
- MACE Agreement Score modeling worker competence.
- Platform-Specific Measures, such as:
- Task-Specific Agreements, such as:
- ...
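A minimal, dependency-free sketch of Fleiss Kappa for Crowds over an item-by-category count matrix, assuming every item received the same fixed number of ratings (crowd data with variable overlap needs an aggregation or subsampling step first); statsmodels also ships a maintained implementation in `statsmodels.stats.inter_rater.fleiss_kappa`.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for a dense item-by-category count matrix.

    counts: list of lists, counts[i][j] = number of workers who assigned
            category j to item i. Assumes every item received the same
            number of ratings n.
    """
    N = len(counts)          # number of items
    k = len(counts[0])       # number of categories
    n = sum(counts[0])       # ratings per item, assumed constant

    # Marginal proportion of each category across all ratings.
    p_j = [sum(row[j] for row in counts) / (N * n) for j in range(k)]
    # Per-item observed agreement.
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in counts]

    P_bar = sum(P_i) / N             # mean observed agreement
    P_e = sum(p * p for p in p_j)    # chance agreement
    return (P_bar - P_e) / (1 - P_e)
```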
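A sketch of the Weighted Majority Agreement idea as reputation-weighted voting. Unlike MACE, which learns worker competence with an unsupervised generative model, this sketch assumes the reputation weights are supplied externally (for example, from gold questions); the winning label's share of the total weight can serve as a per-item agreement score.

```python
from collections import defaultdict

def weighted_majority(labels, reputation):
    """Aggregate one item's crowd labels by reputation-weighted vote.

    labels: {worker_id: label}; reputation: {worker_id: weight}.
    Returns the winning label and its share of the total vote weight.
    """
    votes = defaultdict(float)
    for worker, label in labels.items():
        votes[label] += reputation.get(worker, 1.0)
    total = sum(votes.values())
    label, weight = max(votes.items(), key=lambda kv: kv[1])
    return label, weight / total if total else 0.0
```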
- Counter-Examples:
- Inter-Expert Agreement Measure, which measures expert consensus.
- Automated Agreement Score, which lacks human annotation.
- Individual Worker Measure, which ignores collective agreement.
- See: Evaluation Reliability Measure, Inter-Expert Agreement Measure, Crowdsourcing Quality Control, Worker Quality Measure, Crowd Adjudication Process, Agreement Coefficient, Crowd Annotation.