Bayesian Parameter Estimation Algorithm


A Bayesian Parameter Estimation Algorithm is a parameter estimation algorithm that uses Bayesian inference, i.e. it estimates a parameter from its posterior distribution.



    • It is often desired to use a posterior distribution to estimate a parameter or variable. Several methods of Bayesian estimation select a measure of central tendency from the posterior distribution. For one-dimensional problems, a unique median exists for practical continuous problems, and the posterior median is attractive as a robust estimator.[1]

      If the posterior distribution has a finite mean, then the posterior mean can be used as the estimate: :[math]\displaystyle{ \tilde \theta = \operatorname{E}[\theta] = \int_\theta \theta \, p(\theta \mid \mathbf{X},\alpha) \, d\theta }[/math]

      Taking the value with the greatest posterior probability defines maximum a posteriori (MAP) estimates: :[math]\displaystyle{ \{ \theta_{\text{MAP}}\} \subset \arg \max_\theta p(\theta \mid \mathbf{X},\alpha) . }[/math]

      There are examples where no maximum is attained, in which case the set of MAP estimates is empty.
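      As a concrete sketch (not from the source), the three point estimates above can be computed numerically for any one-dimensional posterior evaluated on a grid. A Beta(3, 2) posterior for a Bernoulli success probability is assumed here purely for illustration; its exact posterior mean is 3/5 and its exact MAP estimate is 2/3, which the grid recipe should recover:

```python
import numpy as np

# Illustrative assumption: a Beta(3, 2) posterior for a Bernoulli
# parameter theta (e.g. 2 successes and 1 failure under a uniform
# Beta(1, 1) prior).  All estimators below are computed from the
# gridded density, so the recipe carries over to any 1-D posterior.
theta = np.linspace(0.0, 1.0, 100001)
dx = theta[1] - theta[0]
density = theta**2 * (1.0 - theta)   # unnormalised Beta(3, 2) pdf
density /= density.sum() * dx        # normalise numerically

# Posterior mean: E[theta] = ∫ theta p(theta | X) dtheta
post_mean = (theta * density).sum() * dx

# Posterior median: theta value where the posterior CDF crosses 1/2
cdf = np.cumsum(density) * dx
post_median = theta[np.searchsorted(cdf, 0.5)]

# MAP estimate: argmax of the posterior density
post_map = theta[np.argmax(density)]

print(post_mean)    # ≈ 0.6   (exact: 3/5)
print(post_map)     # ≈ 0.667 (exact: 2/3)
print(post_median)  # ≈ 0.614 (no closed form for a Beta median)
```

      For a conjugate posterior such as this one, the mean and MAP estimate also have closed forms; the grid approach is shown because it applies even when the posterior is only known up to a normalising constant.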

      Other methods of estimation minimize the posterior risk (the expected loss under the posterior distribution) with respect to a loss function; these are of interest to statistical decision theory, which also uses the sampling distribution ("frequentist statistics").

      The posterior predictive distribution of a new observation [math]\displaystyle{ \tilde{x} }[/math] (that is independent of previous observations) is determined by :[math]\displaystyle{ p(\tilde{x}|\mathbf{X},\alpha) = \int_\theta p(\tilde{x},\theta \mid \mathbf{X},\alpha) \, d\theta = \int_\theta p(\tilde{x} \mid \theta) p(\theta \mid \mathbf{X},\alpha) \, d\theta . }[/math]
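      The marginalization over [math]\displaystyle{ \theta }[/math] in the posterior predictive integral can be sketched numerically. The same illustrative Beta(3, 2) posterior and a Bernoulli likelihood are assumed (not taken from the source); for this conjugate pair the predictive probability of a new success equals the posterior mean, which gives a useful check:

```python
import numpy as np

# Illustrative assumption: Bernoulli likelihood with a Beta(3, 2)
# posterior over theta.  The posterior predictive marginalises
# theta out: p(x_new | X) = ∫ p(x_new | theta) p(theta | X) dtheta.
theta = np.linspace(0.0, 1.0, 100001)
dx = theta[1] - theta[0]
posterior = theta**2 * (1.0 - theta)   # unnormalised Beta(3, 2) pdf
posterior /= posterior.sum() * dx      # normalise numerically

def likelihood(x_new, theta):
    """Bernoulli likelihood p(x_new | theta) for x_new in {0, 1}."""
    return theta if x_new == 1 else 1.0 - theta

# Marginalise theta out numerically for each possible new observation
p_new_1 = (likelihood(1, theta) * posterior).sum() * dx
p_new_0 = (likelihood(0, theta) * posterior).sum() * dx

print(p_new_1)            # ≈ 0.6, the posterior mean, as expected
print(p_new_1 + p_new_0)  # ≈ 1.0, a valid probability distribution
```

      Note that the predictive distribution averages the likelihood over the whole posterior rather than plugging in a single point estimate, so it reflects the remaining uncertainty about [math]\displaystyle{ \theta }[/math].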

  1. Sen, Pranab K.; Keating, J. P.; Mason, R. L. (1993). Pitman's measure of closeness: A comparison of statistical estimators. Philadelphia: SIAM.