2007 StrucLocalTrainCRFsCorefResol


Subject Headings: Conditional Random Field, Coreference Resolution

Notes

Cited By

Quotes

Abstract

Conditional Random Fields (CRFs) have shown great success for problems involving structured output variables. However, for many real-world NLP applications, exact maximum-likelihood training is intractable because computing the global normalization factor even approximately can be extremely hard. In addition, optimizing likelihood often does not correlate with maximizing task-specific evaluation measures. In this paper, we present a novel training procedure, structured local training, that maximizes likelihood while exploiting the benefits of global inference during training: hidden variables are used to capture interactions between local inference and global inference. Furthermore, we introduce biased potential functions that empirically drive CRFs towards performance improvements w.r.t. the preferred evaluation measure for the learning task. We report promising experimental results on two coreference data sets using two task-specific evaluation measures.
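For context, the global normalization factor mentioned in the abstract is the partition function of the standard CRF conditional model (a standard textbook formulation, not notation taken from this paper):

    p_\lambda(y \mid x) = \frac{1}{Z(x)} \exp\Big( \sum_k \lambda_k f_k(x, y) \Big), \qquad Z(x) = \sum_{y'} \exp\Big( \sum_k \lambda_k f_k(x, y') \Big)

Because Z(x) sums over every admissible output structure y' (for coreference resolution, every partition of a document's mentions into entities), exact maximum-likelihood training requires summing over an exponentially large space, which is why the abstract describes it as hard to compute even approximately.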

References


Yejin Choi, and Claire Cardie. (2007). "Structured Local Training and Biased Potential Functions for Conditional Random Fields with Application to Coreference Resolution." In: Proceedings of NAACL HLT 2007. http://acl.ldc.upenn.edu/N/N07/N07-1009.pdf