2007 ThreeNewGraphicalModelsforStati


Subject Headings:

Notes

Cited By

Quotes

Abstract

The supremacy of n-gram models in statistical language modelling has recently been challenged by parametric models that use distributed representations to counteract the difficulties caused by data sparsity. We propose three new probabilistic language models that define the distribution of the next word in a sequence given several preceding words by using distributed representations of those words. We show how real-valued distributed representations for words can be learned at the same time as learning a large set of stochastic binary hidden features that are used to predict the distributed representation of the next word from previous distributed representations. Adding connections from the previous states of the binary hidden features improves performance as does adding direct connections between the real-valued distributed representations. One of our models significantly outperforms the very best n-gram models.
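The abstract's "direct connections between the real-valued distributed representations" describe a bilinear next-word predictor: combine the representations of the preceding words linearly, then score the result against every word's representation. Below is a minimal NumPy sketch of that idea, not the paper's implementation; the vocabulary size, embedding dimension, context length, and random initialisation are illustrative stand-ins, and in the actual models the representation matrix R, position matrices C, and biases b are learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D, N = 1000, 50, 3                 # vocab size, embedding dim, context length (illustrative)
R = rng.normal(0.0, 0.05, (V, D))     # shared real-valued word representations
C = rng.normal(0.0, 0.05, (N, D, D))  # one interaction matrix per context position
b = np.zeros(V)                       # per-word biases

def next_word_distribution(context_ids):
    """P(w_n | w_1..w_{n-1}): predict a representation for the next
    word as a linear function of the context representations, then
    softmax over its similarity to every word's representation."""
    r_hat = sum(C[i] @ R[w] for i, w in enumerate(context_ids))
    scores = R @ r_hat + b            # similarity of the prediction to each word
    scores -= scores.max()            # numerical stability for the softmax
    p = np.exp(scores)
    return p / p.sum()

p = next_word_distribution([12, 7, 42])
print(p.shape, round(p.sum(), 6))     # (1000,) 1.0
```

The paper's other two models replace or augment this direct pathway with stochastic binary hidden features, including connections from the hidden features' previous states; those variants add latent-variable machinery that this sketch deliberately omits.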

References


2007 ThreeNewGraphicalModelsforStati
 Author: Andriy Mnih; Geoffrey E. Hinton
 Title: Three New Graphical Models for Statistical Language Modelling
 Venue: Proceedings of the 24th International Conference on Machine Learning (ICML 2007)
 DOI: 10.1145/1273496.1273577
 Year: 2007