Gaussian Mixtures Algorithm


A Gaussian Mixtures Algorithm is a mixture modeling algorithm that estimates the parameters of a Gaussian Mixture Model, a probability distribution formed as a weighted sum of Gaussian Distribution components.
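To make the definition concrete, the following is a minimal sketch in Python of evaluating a univariate Gaussian mixture density as a weighted sum of component densities. The parameter values and the names weights, means, and stds are illustrative, not from the source:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical example parameters for a K=2 component univariate mixture.
weights = np.array([0.3, 0.7])    # phi: mixture weights, sum to 1
means   = np.array([-1.0, 2.0])   # mu_i: component means
stds    = np.array([0.5, 1.5])    # sigma_i: component standard deviations

def gmm_pdf(x, weights, means, stds):
    """Mixture density: p(x) = sum_i phi_i * N(x; mu_i, sigma_i^2)."""
    x = np.asarray(x)[..., None]  # broadcast x against the K components
    return np.sum(weights * norm.pdf(x, means, stds), axis=-1)

print(gmm_pdf([0.0, 2.0], weights, means, stds))
```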



References

2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Mixture_model#Gaussian_mixture_model Retrieved:2015-10-23.
    • A typical non-Bayesian Gaussian mixture model looks like this (see the first sampling sketch after this reference): [math]\displaystyle{ \begin{array}{lcl} K,N &=& \text{as above} \\ \phi_{i=1 \dots K}, \boldsymbol\phi &=& \text{as above} \\ z_{i=1 \dots N}, x_{i=1 \dots N} &=& \text{as above} \\ \theta_{i=1 \dots K} &=& \{ \mu_{i=1 \dots K}, \sigma^2_{i=1 \dots K} \} \\ \mu_{i=1 \dots K} &=& \text{mean of component } i \\ \sigma^2_{i=1 \dots K} &=& \text{variance of component } i \\ z_{i=1 \dots N} &\sim& \operatorname{Categorical}(\boldsymbol\phi) \\ x_{i=1 \dots N} &\sim& \mathcal{N}(\mu_{z_i}, \sigma^2_{z_i}) \end{array} }[/math]

      A Bayesian version of a Gaussian mixture model is as follows (see the second sampling sketch below): [math]\displaystyle{ \begin{array}{lcl} K,N &=& \text{as above} \\ \phi_{i=1 \dots K}, \boldsymbol\phi &=& \text{as above} \\ z_{i=1 \dots N}, x_{i=1 \dots N} &=& \text{as above} \\ \theta_{i=1 \dots K} &=& \{ \mu_{i=1 \dots K}, \sigma^2_{i=1 \dots K} \} \\ \mu_{i=1 \dots K} &=& \text{mean of component } i \\ \sigma^2_{i=1 \dots K} &=& \text{variance of component } i \\ \mu_0, \lambda, \nu, \sigma_0^2 &=& \text{shared hyperparameters} \\ \mu_{i=1 \dots K} &\sim& \mathcal{N}(\mu_0, \lambda\sigma_i^2) \\ \sigma_{i=1 \dots K}^2 &\sim& \operatorname{Inverse-Gamma}(\nu, \sigma_0^2) \\ \boldsymbol\phi &\sim& \operatorname{Symmetric-Dirichlet}_K(\beta) \\ z_{i=1 \dots N} &\sim& \operatorname{Categorical}(\boldsymbol\phi) \\ x_{i=1 \dots N} &\sim& \mathcal{N}(\mu_{z_i}, \sigma^2_{z_i}) \end{array} }[/math]
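The non-Bayesian specification above reads as a generative (ancestral) sampling procedure: draw each label z_i from the categorical distribution over components, then draw x_i from the selected component's Gaussian. A minimal Python sketch, with illustrative values for K, N, phi, mu, and sigma^2 (not taken from the source):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and parameters (K = 3 components, N = 1000 points).
K, N = 3, 1000
phi    = np.array([0.2, 0.5, 0.3])   # mixture weights, sum to 1
mu     = np.array([-2.0, 0.0, 3.0])  # component means
sigma2 = np.array([0.5, 1.0, 2.0])   # component variances

# z_i ~ Categorical(phi): pick a component index for each observation.
z = rng.choice(K, size=N, p=phi)

# x_i ~ N(mu_{z_i}, sigma^2_{z_i}): sample from the selected component.
x = rng.normal(mu[z], np.sqrt(sigma2[z]))
```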
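The Bayesian version adds one layer on top: the component variances, means, and mixture weights are themselves drawn from their priors before the data are generated. The sketch below uses assumed hyperparameter values, and it uses SciPy's invgamma with shape nu and scale sigma_0^2, which matches one common parameterization of Inverse-Gamma(nu, sigma_0^2):

```python
import numpy as np
from scipy.stats import invgamma

rng = np.random.default_rng(0)

# Hypothetical sizes and shared hyperparameters (illustrative values only).
K, N = 3, 1000
mu0, lam, nu, sigma0_sq, beta = 0.0, 1.0, 3.0, 1.0, 1.0

# sigma_i^2 ~ Inverse-Gamma(nu, sigma_0^2): shape nu, scale sigma_0^2 assumed.
sigma2 = invgamma.rvs(a=nu, scale=sigma0_sq, size=K, random_state=rng)

# mu_i ~ N(mu_0, lambda * sigma_i^2): means are coupled to sampled variances.
mu = rng.normal(mu0, np.sqrt(lam * sigma2))

# phi ~ Symmetric-Dirichlet_K(beta): mixture weights from a symmetric Dirichlet.
phi = rng.dirichlet(np.full(K, beta))

# As in the non-Bayesian model: z_i ~ Categorical(phi), x_i ~ N(mu_{z_i}, sigma^2_{z_i}).
z = rng.choice(K, size=N, p=phi)
x = rng.normal(mu[z], np.sqrt(sigma2[z]))
```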