2016 GloballyNormalizedTransitionbas

From GM-RKB

Subject Headings: SyntaxNet, Parsey McParseface Model.

Notes

Cited By

2016

Quotes

Abstract

We introduce a globally normalized transition-based neural network model that achieves state-of-the-art part-of-speech tagging, dependency parsing and sentence compression results. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves comparable or better accuracies than recurrent models. The key insight is based on a novel proof illustrating the label bias problem and showing that globally normalized models can be strictly more expressive than locally normalized models.
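The contrast the abstract draws — per-step softmax (local normalization) versus a single normalizer over whole derivations (global, CRF-style normalization) — can be sketched on a toy transition system. The scores and the two-step action space below are hypothetical illustrations, not taken from the paper or from SyntaxNet:

```python
import math

# Hypothetical transition scores s(prefix, action) for a 2-step derivation
# with binary actions {0, 1}; the numbers are purely illustrative.
scores = {
    (): [1.0, 2.0],     # scores at the start state
    (0,): [0.5, 1.5],   # scores after taking action 0
    (1,): [2.0, 0.1],   # scores after taking action 1
}

def seq_score(seq):
    """Sum of raw transition scores along one complete derivation."""
    prefix, total = (), 0.0
    for a in seq:
        total += scores[prefix][a]
        prefix = prefix + (a,)
    return total

def local_prob(seq):
    """Locally normalized model: softmax over actions at each step,
    probabilities multiplied along the derivation."""
    p, prefix = 1.0, ()
    for a in seq:
        logits = scores[prefix]
        z = sum(math.exp(x) for x in logits)
        p *= math.exp(logits[a]) / z
        prefix = prefix + (a,)
    return p

def global_prob(seq):
    """Globally normalized model: exp of the whole-derivation score,
    normalized by the partition function over all complete derivations."""
    all_seqs = [(a, b) for a in (0, 1) for b in (0, 1)]
    z = sum(math.exp(seq_score(s)) for s in all_seqs)
    return math.exp(seq_score(seq)) / z
```

Both definitions yield valid distributions over complete derivations, but the global model's normalizer spans entire sequences rather than individual steps, which is the property the paper's label-bias argument shows makes globally normalized models strictly more expressive.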

References


(Andor et al., 2016) ⇒ Daniel Andor, Chris Alberti, David Weiss, Aliaksei Severyn, Alessandro Presta, Kuzman Ganchev, Slav Petrov, and Michael Collins. (2016). "Globally Normalized Transition-based Neural Networks." In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL 2016).