2016 MorphologicalPriorsforProbabili


Subject Headings: Word Embedding System; VarEmbed System; SumEmbed System; Probabilistic Neural Word Embedding System.

Notes

Cited By

Quotes

Abstract

Word embeddings allow natural language processing systems to share statistical information across related words. These embeddings are typically based on distributional statistics, making it difficult for them to generalize to rare or unseen words. We propose to improve word embeddings by incorporating morphological information, capturing shared sub-word features. Unlike previous work that constructs word embeddings directly from morphemes, we combine morphological and distributional information in a unified probabilistic framework, in which the word embedding is a latent variable. The morphological information provides a prior distribution on the latent word embeddings, which in turn condition a likelihood function over an observed corpus. This approach yields improvements on intrinsic word similarity evaluations, and also in the downstream task of part-of-speech tagging.
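
The abstract describes a generative setup in which each word's embedding is a latent variable: morphological information defines a prior distribution over the embedding, and the sampled embedding then conditions a likelihood over the observed corpus. The minimal Python sketch below illustrates only the prior side of that story; the toy morpheme inventory, the sum-of-morpheme-vectors prior mean, the dimensionality, and all names are illustrative assumptions rather than the paper's exact parameterization.

import numpy as np

# Illustrative sketch (assumptions, not the paper's exact model): morpheme
# vectors induce a Gaussian prior over a word's latent embedding, so rare
# or unseen words inherit information from their sub-word units.

rng = np.random.default_rng(0)
EMBED_DIM = 8

# Toy morpheme inventory with randomly initialized morpheme vectors.
morpheme_vectors = {
    "un": rng.normal(size=EMBED_DIM),
    "happy": rng.normal(size=EMBED_DIM),
    "ness": rng.normal(size=EMBED_DIM),
}

def prior_mean(morphemes):
    """Assumed prior mean: the sum of a word's morpheme vectors."""
    return sum(morpheme_vectors[m] for m in morphemes)

def sample_embedding(morphemes, sigma=0.1):
    """Draw a latent word embedding e ~ N(prior_mean(morphemes), sigma^2 I)."""
    return prior_mean(morphemes) + sigma * rng.normal(size=EMBED_DIM)

# Even a rare or unseen word such as "unhappiness" receives an informed
# embedding through its morphemes, rather than a purely distributional one.
print(sample_embedding(["un", "happy", "ness"]))

In the full model, embeddings drawn from this prior condition a likelihood function over the corpus, which is how the unified probabilistic framework combines morphological and distributional information.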

References

BibTeX

@inproceedings{2016_MorphologicalPriorsforProbabili,
  author    = {Parminder Bhatia and
               Robert Guthrie and
               Jacob Eisenstein},
  editor    = {Jian Su and
               Xavier Carreras and
               Kevin Duh},
  title     = {Morphological Priors for Probabilistic Neural Word Embeddings},
  booktitle = {Proceedings of the 2016 Conference on Empirical Methods in Natural
               Language Processing (EMNLP 2016)},
  pages     = {490--500},
  publisher = {The Association for Computational Linguistics},
  year      = {2016},
  url       = {https://doi.org/10.18653/v1/d16-1047},
  doi       = {10.18653/v1/d16-1047},
}


Author: Jacob Eisenstein, Robert Guthrie, Parminder Bhatia
Title: Morphological Priors for Probabilistic Neural Word Embeddings
Year: 2016