2009 ScalablePseudoLikelihoodEstimat

From GM-RKB

Subject Headings:

Notes

Cited By

Quotes

Author Keywords

Bayesian Networks, Hybrid Random Fields, Markov Random Fields, Modularity, Scalability

Abstract

Learning probabilistic graphical models from high-dimensional datasets is a computationally challenging task. In many interesting applications, the domain dimensionality is such as to prevent state-of-the-art statistical learning techniques from delivering accurate models in reasonable time. This paper presents a hybrid random field model for pseudo-likelihood estimation in high-dimensional domains. A theoretical analysis proves that the class of pseudo-likelihood distributions representable by hybrid random fields strictly includes the class of joint probability distributions representable by Bayesian networks. In order to learn hybrid random fields from data, we develop the Markov Blanket Merging algorithm. Theoretical and experimental evidence shows that Markov Blanket Merging scales up very well to high-dimensional datasets. As compared to other widely used statistical learning techniques, Markov Blanket Merging delivers accurate results in a number of link prediction tasks, while also achieving significant improvements in terms of computational efficiency.
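The pseudo-likelihood objective mentioned in the abstract replaces the intractable joint likelihood with a product of per-variable conditional probabilities, each conditioned only on the variable's Markov blanket. The following is a minimal illustrative sketch (not the paper's implementation); the chain structure, binary variables, and edge weights are hypothetical, chosen only to show how the per-variable conditionals are combined:

```python
import math

# Hypothetical toy model: a log-linear pairwise Markov random field
# over 3 binary variables on a chain 0 - 1 - 2. The edge weights are
# illustrative assumptions, not values from the paper.
W = {(0, 1): 0.8, (1, 2): -0.5}  # pairwise log-potentials

def cond_prob(x, i):
    """P(x_i | rest) for the pairwise model; only edges touching i matter,
    so the conditional depends on the Markov blanket of i alone."""
    def score(v):
        s = 0.0
        for (a, b), w in W.items():
            if a == i:
                s += w * v * x[b]
            elif b == i:
                s += w * x[a] * v
        return s
    # Normalize over the two possible values of x_i (binary case).
    den = math.exp(score(0)) + math.exp(score(1))
    return math.exp(score(x[i])) / den

def pseudo_log_likelihood(x):
    # Sum of log conditionals of each variable given the others;
    # this sidesteps the global partition function of the joint.
    return sum(math.log(cond_prob(x, i)) for i in range(len(x)))

print(pseudo_log_likelihood([1, 0, 1]))
```

Because each conditional only involves a variable's neighbors, the objective decomposes locally, which is what makes pseudo-likelihood estimation attractive in high-dimensional domains.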

Our software implementation of the models investigated in this paper is publicly available at http://www.dii.unisi.it/~freno/. The same website also hosts the datasets used in this work that are not available elsewhere with the same preprocessing used for our experiments.

References

Antonino Freno, Edmondo Trentin, and Marco Gori. (2009). "Scalable Pseudo-likelihood Estimation in Hybrid Random Fields." In: Proceedings of KDD-2009. doi:10.1145/1557019.1557059