2015 Stochastic Divergence Minimization for Online Collapsed Variational Bayes Zero Inference of Latent Dirichlet Allocation


Subject Headings:

Notes

Cited By

Quotes

Author Keywords

Abstract

The collapsed variational Bayes zero (CVB0) inference is a variational inference improved by marginalizing out the parameters, as in the collapsed Gibbs sampler. A drawback of the CVB0 inference is its memory requirement: a probability vector over latent topics must be maintained for every token in the corpus, so when the total number of tokens is N and the number of topics is K, the CVB0 inference requires O(NK) memory. A stochastic approximation of the CVB0 inference (SCVB0) can reduce O(NK) to O(VK), where V denotes the vocabulary size. We reformulate the existing SCVB0 inference using the stochastic divergence minimization algorithm, whose convergence can be analyzed in terms of martingale convergence theory. We also reveal a property of the CVB0 inference in terms of the leave-one-out perplexity, which leads to an estimation algorithm for the Dirichlet distribution parameters. The predictive performance of the proposed SCVB0 inference is better than that of the original SCVB0 inference on four datasets.
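
To make the O(NK) versus O(VK) contrast concrete, the following is a minimal Python sketch of an SCVB0-style online update for LDA: the zero-order responsibilities are recomputed per token and folded into running expected-count statistics, so only a V×K matrix (plus a K-vector per document) is kept in memory. Hyperparameter values, step sizes, and variable names below are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Illustrative SCVB0-style sketch: per-token responsibilities gamma are
# recomputed on the fly rather than stored, so memory is dominated by the
# V x K expected-count matrix (O(VK)) instead of one K-vector per token
# (O(NK)). Names, priors, and step sizes are assumptions for illustration.

V, K = 1000, 20            # vocabulary size, number of topics
alpha, eta = 0.1, 0.01     # symmetric Dirichlet priors (assumed values)

rng = np.random.default_rng(0)
n_wk = rng.gamma(1.0, 1.0, size=(V, K))  # expected topic-word counts: O(VK)
n_k = n_wk.sum(axis=0)                   # expected per-topic totals: O(K)

def update_document(doc, C_total, rho_phi=0.01, rho_theta=0.1, burn_in=1):
    """One online pass over a document given as a list of word ids."""
    global n_wk, n_k
    C_d = len(doc)
    n_dk = np.full(K, C_d / K)    # expected doc-topic counts: O(K)
    y = np.zeros((V, K))          # this document's topic-word statistics
    for sweep in range(burn_in + 1):
        for w in doc:
            # CVB0 zero-order responsibility for this token (never stored)
            gamma = (n_wk[w] + eta) / (n_k + V * eta) * (n_dk + alpha)
            gamma /= gamma.sum()
            # online average of the document statistics
            n_dk = (1.0 - rho_theta) * n_dk + rho_theta * C_d * gamma
            if sweep == burn_in:  # collect statistics on the final sweep
                y[w] += gamma
    # stochastic update of the corpus-wide statistics (rescaled to the corpus)
    n_wk = (1.0 - rho_phi) * n_wk + rho_phi * (C_total / C_d) * y
    n_k = n_wk.sum(axis=0)

# e.g. stream documents from a corpus of ~100,000 tokens:
update_document([3, 17, 17, 42, 9], C_total=100_000)
```

In a practical implementation the decay of n_wk would be applied lazily and y kept sparse; the dense arrays here are used only for readability.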

References

Author: Hiroshi Nakagawa, Issei Sato
Title: Stochastic Divergence Minimization for Online Collapsed Variational Bayes Zero Inference of Latent Dirichlet Allocation
DOI: 10.1145/2783258.2783355
Year: 2015