2016 AchievingOpenVocabularyNeuralMa

Subject Headings: Neural Machine Translation Task; Natural Language Generation Task.

Notes

Computing Resource(s):

Pre-Print(s) and Other Link(s):

Cited By

Quotes

Abstract

Nearly all previous work on neural machine translation (NMT) has used quite restricted vocabularies, perhaps with a subsequent method to patch in unknown words. This paper presents a novel word-character solution to achieving open vocabulary NMT. We build hybrid systems that translate mostly at the word level and consult the character components for rare words. Our character-level recurrent neural networks compute source word representations and recover unknown target words when needed. The twofold advantage of such a hybrid approach is that it is much faster and easier to train than character-based ones; at the same time, it never produces unknown words as in the case of word-based models. On the WMT'15 English to Czech translation task, this hybrid approach offers an additional boost of +2.1-11.4 BLEU points over models that already handle unknown words. Our best system achieves a new state-of-the-art result with a 20.7 BLEU score. We demonstrate that our character models can successfully learn to not only generate well-formed words for Czech, a highly-inflected language with a very complex vocabulary, but also build correct representations for English source words.
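
The source-side half of the hybrid design described in the abstract (translate at the word level, but let a character-level RNN build the representation of any word outside the word vocabulary) can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' implementation: the names CharWordEncoder and embed_source_word and the dictionary arguments are assumptions, and the target-side recovery of unknown words mentioned in the abstract is omitted.

import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    """Builds a word embedding from its characters with an LSTM (illustrative sketch)."""
    def __init__(self, n_chars, char_dim=64, word_dim=512):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim)
        self.lstm = nn.LSTM(char_dim, word_dim, batch_first=True)

    def forward(self, char_ids):          # char_ids: (batch, max_word_len)
        chars = self.char_emb(char_ids)   # (batch, max_word_len, char_dim)
        _, (h, _) = self.lstm(chars)      # final hidden state summarizes the word
        return h[-1]                      # (batch, word_dim)

def embed_source_word(word, word_vocab, word_emb, char_encoder, char_vocab):
    """Word-level embedding for frequent words; character-level fallback for rare ones."""
    if word in word_vocab:
        idx = torch.tensor([word_vocab[word]])
        return word_emb(idx)              # ordinary word-level lookup
    # Rare/unknown word: compose an embedding from its characters instead of
    # sharing a single <unk> vector.
    char_ids = torch.tensor([[char_vocab.get(c, 0) for c in word]])
    return char_encoder(char_ids)

In this sketch, word_emb would be an ordinary nn.Embedding over the restricted word vocabulary, so the word-level NMT model consumes the same vector size whether a word came from the lookup table or from the character encoder.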

References

BibTeX

@inproceedings{2016_AchievingOpenVocabularyNeuralMa,
  author    = {Minh-Thang Luong and
               Christopher D. Manning},
  title     = {Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character
               Models},
  booktitle = {Proceedings of the 54th Annual Meeting of the Association for Computational
               Linguistics (ACL 2016), August 7-12, 2016, Berlin, Germany, Volume
               1: Long Papers},
  publisher = {Association for Computational Linguistics},
  year      = {2016},
  url       = {https://doi.org/10.18653/v1/p16-1100},
  doi       = {10.18653/v1/p16-1100},
}


Author: Christopher D. Manning, Minh-Thang Luong
Title: Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models
Year: 2016