2014 CompositionalMorphologyforWordR


Subject Headings: Word Embedding System, SumEmbed System, Compositional Morphological Representation System.

Notes

Cited By

Quotes

Abstract

This paper presents a scalable method for integrating compositional morphological representations into a vector-based probabilistic language model. Our approach is evaluated in the context of log-bilinear language models, rendered suitably efficient for implementation inside a machine translation decoder by factoring the vocabulary. We perform both intrinsic and extrinsic evaluations, presenting results on a range of languages which demonstrate that our model learns morphological representations that both perform well on word similarity tasks and lead to substantial reductions in perplexity. When used for translation into morphologically rich languages with large vocabularies, our models obtain improvements of up to 1.2 BLEU points relative to a baseline system using back-off n-gram models.
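The core idea summarised in the abstract is that a word's representation is built compositionally from the representations of its morphemes inside a log-bilinear language model. The sketch below is illustrative only and is not the authors' implementation: the function names, the vector dimensionality, the random initialisation, and the segmentation of "imperfection" are assumptions made for the example, and the class-based factoring of the vocabulary used in the paper for efficient normalisation is omitted.

import numpy as np

dim = 100
rng = np.random.default_rng(0)

# Hypothetical lookup tables: one for surface word forms, one for morphemes
# shared across words.
word_vectors = {}
morpheme_vectors = {}

def vector(table, key):
    """Fetch (or lazily initialise) an embedding for `key`."""
    if key not in table:
        table[key] = rng.normal(scale=0.1, size=dim)
    return table[key]

def compose(word, morphemes):
    """Additive composition: word-specific vector plus its morpheme vectors."""
    v = vector(word_vectors, word)
    for m in morphemes:
        v = v + vector(morpheme_vectors, m)
    return v

# Log-bilinear scoring: the predicted representation is a linear combination
# of the (composed) context vectors; a candidate word is scored by its dot
# product with that prediction.
context_matrices = [np.eye(dim) for _ in range(2)]  # one matrix per context position

def score(context, context_morphs, target, target_morphs):
    predicted = sum(C @ compose(w, m)
                    for C, w, m in zip(context_matrices, context, context_morphs))
    return predicted @ compose(target, target_morphs)

# Example with an illustrative segmentation of "imperfection".
s = score(context=["an", "obvious"],
          context_morphs=[["an"], ["obvious"]],
          target="imperfection",
          target_morphs=["im", "perfect", "ion"])
print(f"unnormalised log-bilinear score: {s:.3f}")

Because the morpheme vectors are shared across words, rare or unseen inflected forms can borrow statistical strength from their stems and affixes; turning the unnormalised score into a probability is handled in the paper by factoring the vocabulary, which is not shown in this sketch.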

References

BibTeX

@inproceedings{2014_CompositionalMorphologyforWordR,
  author    = {Jan A. Botha and
               Phil Blunsom},
  title     = {Compositional Morphology for Word Representations and Language Modelling},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning
               (ICML 2014)},
  series    = {JMLR Workshop and Conference Proceedings},
  volume    = {32},
  pages     = {1899--1907},
  publisher = {JMLR.org},
  year      = {2014},
  url       = {http://proceedings.mlr.press/v32/botha14.html},
}


Author: Phil Blunsom, Jan A. Botha
Title: Compositional Morphology for Word Representations and Language Modelling
Year: 2014