Marginal Likelihood Function


A Marginal Likelihood Function is a likelihood function in which some parameter variables have been marginalized out, i.e., integrated or summed over.
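As a minimal illustrative sketch (not drawn from the references below), the Python snippet that follows marginalizes the success probability theta out of a Bernoulli likelihood under a Beta(a, b) prior, and checks the numerical integral against the known closed form B(a + k, b + n − k) / B(a, b). All names and the example numbers are hypothetical.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import betaln

def marginal_likelihood_numeric(k, n, a, b):
    """P(D | a, b): integrate the Bernoulli likelihood against a Beta(a, b) prior."""
    def integrand(theta):
        # Likelihood of one particular sequence with k successes in n trials.
        likelihood = theta**k * (1.0 - theta)**(n - k)
        # Beta(a, b) prior density for theta.
        prior = theta**(a - 1.0) * (1.0 - theta)**(b - 1.0) / np.exp(betaln(a, b))
        return likelihood * prior
    value, _abserr = quad(integrand, 0.0, 1.0)
    return value

def marginal_likelihood_closed_form(k, n, a, b):
    """Closed form for the same integral: B(a + k, b + n - k) / B(a, b)."""
    return np.exp(betaln(a + k, b + n - k) - betaln(a, b))

k, n = 7, 10  # 7 successes observed in 10 Bernoulli trials
print(marginal_likelihood_numeric(k, n, a=1.0, b=1.0))      # ~7.576e-4
print(marginal_likelihood_closed_form(k, n, a=1.0, b=1.0))  # matches
```

The resulting quantity depends only on the data and the hyperparameters (a, b); theta no longer appears, which is exactly what "marginalized out" means.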



References


2014

  • (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Normalizing_constant#Bayes Retrieved:2014-1-15.
    • Bayes' theorem says that the posterior probability measure is proportional to the product of the prior probability measure and the likelihood function. "Proportional to" means that one must multiply or divide by a normalizing constant to assign measure 1 to the whole space, i.e., to obtain a probability measure. In a simple discrete case we have

      [math]\displaystyle{ P(H_0|D) = \frac{P(D|H_0)P(H_0)}{P(D)} }[/math]

      where P(H_0) is the prior probability that the hypothesis is true; P(D|H_0) is the conditional probability of the data given that the hypothesis is true, which, viewed with the data fixed, is the likelihood of the hypothesis (or its parameters) given the data; and P(H_0|D) is the posterior probability that the hypothesis is true given the data. P(D), the probability of producing the data, is difficult to calculate on its own, so an alternative way to describe this relationship is as one of proportionality:

      [math]\displaystyle{ P(H_0|D) \propto P(D|H_0)P(H_0). }[/math]

      Since P(H_i|D) is a probability, the sum over all possible (mutually exclusive) hypotheses must be 1, leading to the conclusion that

      [math]\displaystyle{ P(H_0|D) = \frac{P(D|H_0)P(H_0)}{\displaystyle\sum_i P(D|H_i)P(H_i)} . }[/math]

      In this case, the reciprocal of the value

      [math]\displaystyle{ P(D)=\sum_i P(D|H_i)P(H_i) }[/math]

      is the normalizing constant.[1] The result extends from countably many hypotheses to uncountably many by replacing the sum with an integral over the hypothesis space (a numerical sketch of the discrete case follows this entry).

  1. Feller, 1968, p. 124.
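A minimal sketch of the discrete case quoted above, assuming nothing beyond the passage itself: the unnormalized posterior P(D|H_i)P(H_i) is divided by its sum over all hypotheses, which is exactly the normalizing constant P(D). The prior and likelihood values here are made up for illustration.

```python
import numpy as np

priors = np.array([0.5, 0.3, 0.2])          # P(H_i) for mutually exclusive hypotheses
likelihoods = np.array([0.10, 0.40, 0.25])  # P(D | H_i)

unnormalized = likelihoods * priors          # P(D | H_i) P(H_i)
normalizing_constant = unnormalized.sum()    # P(D) = sum_i P(D | H_i) P(H_i)
posterior = unnormalized / normalizing_constant

print(normalizing_constant)  # 0.22
print(posterior)             # [0.2273, 0.5455, 0.2273], sums to 1
```

Replacing the sum over hypotheses with a numerical integral over a continuous parameter recovers the marginal likelihood computation sketched after the definition at the top of this page.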
