Anchoring and Agreement in Syntactic Annotations

Title: Anchoring and Agreement in Syntactic Annotations
Publication Type: CBMM Memos
Year of Publication: 2016
Authors: Berzak, Y., Huang, Y., Barbu, A., Korhonen, A., Katz, B.
Date Published: 09/2016
Abstract

Published in the Proceedings of EMNLP 2016

We present a study on two key characteristics of human syntactic annotations: anchoring and agreement. Anchoring is a well-known cognitive bias in human decision making, where judgments are drawn towards preexisting values. We study the influence of anchoring on a standard approach to creation of syntactic resources where syntactic annotations are obtained via human editing of tagger and parser output. Our experiments demonstrate a clear anchoring effect and reveal unwanted consequences, including overestimation of parsing performance and lower quality of annotations in comparison with human-based annotations. Using sentences from the Penn Treebank WSJ, we also report systematically obtained inter-annotator agreement estimates for English dependency parsing. Our agreement results control for parser bias, and are consequential in that they are on par with state of the art parsing performance for English newswire. We discuss the impact of our findings on strategies for future annotation efforts and parser evaluations.
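The abstract refers to inter-annotator agreement estimates for dependency parsing. As a minimal sketch of how such agreement is commonly measured (an attachment-score style comparison, not the authors' actual evaluation code), the following Python snippet computes unlabeled and labeled agreement between two annotators over the same sentence; the example annotations are hypothetical.

```python
"""Sketch: attachment-style agreement between two dependency annotators.

Assumption: each annotation is a list of (head_index, relation) pairs per
token, with head index 0 denoting the root. This mirrors the standard
UAS/LAS-style comparison; it is not the paper's exact protocol.
"""

from typing import List, Tuple

Annotation = List[Tuple[int, str]]  # one (head, relation) entry per token


def attachment_agreement(ann_a: Annotation, ann_b: Annotation) -> Tuple[float, float]:
    """Return (unlabeled, labeled) agreement between two annotations of one sentence."""
    assert len(ann_a) == len(ann_b), "annotations must cover the same tokens"
    n = len(ann_a)
    # Unlabeled: same head chosen for the token; labeled: same head and relation.
    unlabeled = sum(ha == hb for (ha, _), (hb, _) in zip(ann_a, ann_b))
    labeled = sum(a == b for a, b in zip(ann_a, ann_b))
    return unlabeled / n, labeled / n


if __name__ == "__main__":
    # Hypothetical annotations of "She read the memo" by two annotators,
    # differing only in the relation assigned to the last token.
    annotator_1 = [(2, "nsubj"), (0, "root"), (4, "det"), (2, "obj")]
    annotator_2 = [(2, "nsubj"), (0, "root"), (4, "det"), (2, "dobj")]
    uas, las = attachment_agreement(annotator_1, annotator_2)
    print(f"unlabeled agreement: {uas:.2f}, labeled agreement: {las:.2f}")
```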

arXiv: https://arxiv.org/abs/1605.04481

DSpace@MIT: http://hdl.handle.net/1721.1/104453

Download: CBMM-Memo-055.pdf
CBMM Memo No: 055

CBMM Relationship: 

  • CBMM Related