Hierarchical Bayesian Metamodel


A Hierarchical Bayesian Metamodel is a directed conditional statistical metamodel that describes a family of hierarchical Bayesian networks.
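To make the "family of hierarchical Bayesian networks" reading concrete, the sketch below (not from the source) treats the metamodel as a template: a function that, given structural choices such as the number of groups and a hyperprior scale, instantiates one concrete two-level hierarchical Bayesian network as a forward sampler. All names and distributional choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hierarchical_model(n_groups, hyperprior_scale):
    """Instantiate one member of the family described by the metamodel.

    The (assumed) structure is a two-level Gaussian hierarchy:
        phi            ~ Normal(0, hyperprior_scale)   # hyperparameter
        theta_g | phi  ~ Normal(phi, 1)                # group-level parameters
        x_g | theta_g  ~ Normal(theta_g, 1)            # observations
    """
    def sample(n_obs_per_group):
        phi = rng.normal(0.0, hyperprior_scale)
        theta = rng.normal(phi, 1.0, size=n_groups)
        x = rng.normal(theta[:, None], 1.0, size=(n_groups, n_obs_per_group))
        return phi, theta, x
    return sample

# Two different members of the same family: same directed structure, different settings.
sample_small = make_hierarchical_model(n_groups=3, hyperprior_scale=1.0)
sample_large = make_hierarchical_model(n_groups=10, hyperprior_scale=5.0)
phi, theta, x = sample_small(n_obs_per_group=4)
```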



References

2011

  • http://en.wikipedia.org/wiki/Hierarchical_Bayes_model
    • The hierarchical Bayes method is a topic in modern Bayesian analysis. It is a powerful tool for expressing rich statistical models that more fully reflect a given problem than a simpler model could. Given data $x$ and parameters $\vartheta$, a simple Bayesian analysis starts with a prior probability (prior) $p(\vartheta)$ and likelihood $p(x|\vartheta)$ to compute a posterior probability $p(\vartheta|x) \propto p(x|\vartheta)p(\vartheta)$. Often the prior on $\vartheta$ depends in turn on other parameters $\varphi$ that are not mentioned in the likelihood. So, the prior $p(\vartheta)$ must be replaced by a likelihood $p(\vartheta|\varphi)$, and a prior $p(\varphi)$ on the newly introduced parameters $\varphi$ is required, resulting in a posterior probability $p(\vartheta,\varphi|x) \propto p(x|\vartheta)p(\vartheta|\varphi)p(\varphi)$. This is the simplest example of a hierarchical Bayes model. The process may be repeated; for example, the parameters $\varphi$ may depend in turn on additional parameters $\psi$, which will require their own prior. Eventually the process must terminate, with priors that do not depend on any other unmentioned parameters.
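
The following is a minimal numerical sketch of the two-level posterior quoted above, assuming (for illustration only) Gaussian forms for each factor: the likelihood $p(x|\vartheta)$ is Normal($\vartheta$, 1), the prior $p(\vartheta|\varphi)$ is Normal($\varphi$, 1), and the hyperprior $p(\varphi)$ is Normal(0, 3). The unnormalized posterior $p(\vartheta,\varphi|x) \propto p(x|\vartheta)p(\vartheta|\varphi)p(\varphi)$ is evaluated on a grid; these distributional choices, the toy data, and the variable names are assumptions, not part of the source.

```python
import numpy as np
from scipy.stats import norm

# Assumed Gaussian choices for each factor in the hierarchy:
#   likelihood:  p(x | theta)   = Normal(x; theta, 1)
#   prior:       p(theta | phi) = Normal(theta; phi, 1)
#   hyperprior:  p(phi)         = Normal(phi; 0, 3)
x_obs = np.array([1.2, 0.8, 1.5])          # toy observations

theta_grid = np.linspace(-5, 5, 201)
phi_grid = np.linspace(-5, 5, 201)
theta, phi = np.meshgrid(theta_grid, phi_grid, indexing="ij")
d_theta = theta_grid[1] - theta_grid[0]
d_phi = phi_grid[1] - phi_grid[0]

# log p(x | theta): sum of log-likelihoods over the observations
log_lik = norm.logpdf(x_obs[:, None, None], loc=theta, scale=1.0).sum(axis=0)

# log p(theta | phi) and log p(phi)
log_prior = norm.logpdf(theta, loc=phi, scale=1.0)
log_hyperprior = norm.logpdf(phi, loc=0.0, scale=3.0)

# Unnormalized log posterior:
#   log p(theta, phi | x) = log p(x|theta) + log p(theta|phi) + log p(phi) + const
log_post = log_lik + log_prior + log_hyperprior

# Normalize on the grid to get an approximate joint posterior density
post = np.exp(log_post - log_post.max())
post /= post.sum() * d_theta * d_phi

# Marginal posterior over theta (integrate out phi numerically)
post_theta = post.sum(axis=1) * d_phi
print("Posterior mean of theta ~", (theta_grid * post_theta).sum() * d_theta)
```

Adding a third level, as the excerpt notes with $\psi$, would simply multiply one more prior factor into the joint density before normalizing.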

2003