# Likelihood Function


## References

### 2015

• (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Likelihood_function#Historical_remarks Retrieved:2015-6-4.
• In statistics, a likelihood function (often simply the likelihood) is a function of the parameters of a statistical model.

Likelihood functions play a key role in statistical inference, especially in methods of estimating a parameter from a set of statistics. In informal contexts, "likelihood" is often used as a synonym for "probability." In statistical usage, however, a distinction is made depending on the roles of the outcome and the parameter. Probability is used when describing a function of the outcome given a fixed parameter value: for example, if a coin is flipped 10 times and it is a fair coin, what is the probability of it landing heads-up every time? Likelihood is used when describing a function of a parameter given a fixed outcome: for example, if a coin is flipped 10 times and it has landed heads-up 10 times, what is the likelihood that the coin is fair?
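The coin example above can be made concrete with a short sketch. Under a binomial model, the same function is read two ways: as a probability (parameter fixed, outcome varies) and as a likelihood (outcome fixed, parameter varies). The function name and the comparison value p = 0.9 are illustrative choices, not part of the original text.

```python
from math import comb

def prob_heads(k, n, p):
    """P(X = k heads) in n flips with heads-probability p (binomial pmf)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability: the parameter is fixed (fair coin, p = 0.5), the outcome varies.
p_all_heads = prob_heads(10, 10, 0.5)        # P(10 heads | p = 0.5) = 1/1024

# Likelihood: the outcome is fixed (10 heads observed), the parameter varies.
likelihood_fair   = prob_heads(10, 10, 0.5)  # L(p = 0.5 | 10 heads)
likelihood_biased = prob_heads(10, 10, 0.9)  # L(p = 0.9 | 10 heads), illustrative value

print(p_all_heads)                           # ≈ 0.000977
print(likelihood_biased > likelihood_fair)   # True: p = 0.9 explains 10 heads better
```

Having observed 10 heads in 10 flips, a heavily biased coin is far more likely than a fair one, even though the fair coin assigns the outcome a nonzero probability.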

### 2014

• (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/likelihood_function#Definition Retrieved:2014-12-10.
• The likelihood function is defined differently for discrete and continuous probability distributions.
• Discrete probability distribution
• Let X be a random variable with a discrete probability distribution p depending on a parameter θ. Then the function $\mathcal{L}(\theta | x) = p_\theta(x) = P_\theta(X = x)$,

considered as a function of θ, is called the likelihood function (of θ, given the outcome x of X). Sometimes the probability of the value x of X for the parameter value θ is written as $P(X=x|\theta)$; it is often written as $P(X=x;\theta)$ to emphasize that this value is not a conditional probability, because θ is a parameter and not a random variable.
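As a sketch of the discrete definition, the code below evaluates the same pmf with the outcome x held fixed and the parameter θ varied. The Poisson model is an assumed example distribution, not one named in the text.

```python
from math import exp, factorial

def poisson_pmf(x, theta):
    """P_theta(X = x) for a Poisson distribution with rate theta."""
    return exp(-theta) * theta**x / factorial(x)

x_observed = 3

# Same function, now read as L(theta | x): x fixed, theta varying over a grid.
likelihoods = {theta: poisson_pmf(x_observed, theta)
               for theta in (1.0, 2.0, 3.0, 4.0)}

# For a single Poisson observation, the likelihood peaks at theta = x.
best = max(likelihoods, key=likelihoods.get)
print(best)  # 3.0
```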

• Continuous probability distribution
• Let X be a random variable with a continuous probability distribution with density function f depending on a parameter θ. Then the function $\mathcal{L}(\theta | x) = f_\theta(x)$,

considered as a function of θ, is called the likelihood function (of θ, given the outcome x of X). Sometimes the density function for the value x of X for the parameter value θ is written as $f(x|\theta)$, but it should not be considered a conditional probability density.
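The continuous case works the same way, with a density in place of a pmf. A minimal sketch, assuming a normal model with unknown mean θ and known unit variance (an assumed setup for illustration):

```python
from math import exp, pi, sqrt

def normal_pdf(x, theta):
    """f_theta(x): normal density with mean theta and variance 1."""
    return exp(-0.5 * (x - theta) ** 2) / sqrt(2 * pi)

x_observed = 1.7

# Read as L(theta | x) with x fixed: the density is largest when theta = x,
# so for one observation the maximum likelihood estimate of the mean is x itself.
assert normal_pdf(x_observed, 1.7) > normal_pdf(x_observed, 0.0)
```

Note that a density value can exceed 1, which is one reason $f(x|\theta)$, viewed as a function of θ, is not itself a probability.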

The actual value of a likelihood function bears no meaning; its use lies in comparing one value with another. For example, one value of the parameter may be more likely than another, given the outcome of the sample, or one specific value may be the most likely of all: the maximum likelihood estimate. Comparisons may also be made by considering the quotient of two likelihood values. That is why $\mathcal{L}(\theta |x)$ is generally permitted to be any positive multiple of the function $\mathcal{L}$ defined above. More precisely, then, a likelihood function is any representative from an equivalence class of functions $\mathcal{L} \in \left\lbrace \alpha \, P_\theta : \alpha > 0 \right\rbrace$,

where the constant of proportionality α > 0 is not permitted to depend on θ and is required to be the same for all likelihood functions used in any one comparison. In particular, the numerical value $\mathcal{L}(\theta |x)$ alone is immaterial; all that matters are maximum values of $\mathcal{L}$, or likelihood ratios, such as those of the form $\frac{\mathcal{L}(\theta_2 | x)}{\mathcal{L}(\theta_1 | x)} = \frac{\alpha P(X=x|\theta_2)}{\alpha P(X=x|\theta_1)} = \frac{P(X=x|\theta_2)}{P(X=x|\theta_1)},$

that are invariant with respect to the constant of proportionality α.
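The invariance under α can be checked numerically. In the sketch below, θ¹⁰ stands in for the 10-heads likelihood from the coin example (the binomial coefficient is a constant in θ, so it is absorbed into α); the function name and the scale factor 7.3 are illustrative.

```python
def coin_likelihood(theta, alpha=1.0):
    """alpha * L(theta | 10 heads in 10 flips); constant factors absorbed in alpha."""
    return alpha * theta ** 10

theta1, theta2 = 0.5, 0.9

# Likelihood ratio with alpha = 1 and with an arbitrary alpha = 7.3:
ratio_plain  = coin_likelihood(theta2) / coin_likelihood(theta1)
ratio_scaled = coin_likelihood(theta2, alpha=7.3) / coin_likelihood(theta1, alpha=7.3)

# The common factor alpha cancels, so both ratios agree.
print(abs(ratio_plain - ratio_scaled) < 1e-6)  # True
```

This is why only ratios (and argmax locations) of a likelihood function carry meaning, never its raw values.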

For more about making inferences via likelihood functions, see also the method of maximum likelihood, and likelihood-ratio testing.