Bayes Rule


A Bayes Rule is a probability update rule which states that the posterior probability that a belief is true is obtained by multiplying the prior probability (that the belief is true) by the probability that the evidence is true given that the belief is true, divided by the probability that the evidence is true regardless of whether the belief is true.

  • AKA: Bayes' Theorem.
  • Context:
    • It can be stated as: if [math]\displaystyle{ E_1,E_2,\dots,E_n }[/math] are mutually disjoint events with a priori probabilities [math]\displaystyle{ P(E_i)\neq 0,(i=1,2,\dots,n) }[/math], then for any arbitrary event [math]\displaystyle{ A }[/math] which is a subset of [math]\displaystyle{ \displaystyle\bigcup_{i=1}^{n}E_i }[/math], such that [math]\displaystyle{ P(A)\gt 0 }[/math], we have the posterior probabilities [math]\displaystyle{ P(E_i|A)=\frac{P(E_i)P(A|E_i)}{\displaystyle\sum_{j=1}^{n}P(E_j)P(A|E_j)},\quad i=1,2,\dots,n. }[/math] Here [math]\displaystyle{ P(A|E_i),i=1,2,\dots,n }[/math] are called likelihoods.
    • It can be used by a Bayesian Inference Algorithm.
    • It can be proved by application of the Product Rule.
    • It can be used as a Decision Rule based on minimizing Average Loss.
    • It can be restated as “The plausibility of your belief depends on the degree to which your belief -- and only your belief -- explains the evidence for it. The more alternative explanations there are for the evidence, the less plausible your belief is.”[1]
  • Example(s):
    • Posterior Probability = The Prior × Normalized Likelihood.
    • [math]\displaystyle{ Pr(A \vert B) = Pr(A) \times \frac{Pr(B \vert A)}{Pr(B)} }[/math].
    • In answering a question on a multiple-choice test, a student either knows the answer (with probability [math]\displaystyle{ p }[/math]) or guesses (with probability [math]\displaystyle{ 1-p }[/math]). Assume that the probability of answering a question correctly is unity for a student who knows the answer and [math]\displaystyle{ \frac{1}{m} }[/math] for a student who guesses, where [math]\displaystyle{ m }[/math] is the number of multiple-choice alternatives. Given that a student answers a question correctly, the probability that the student really knows the answer can be found using Bayes' theorem as follows. Let [math]\displaystyle{ E_1= }[/math] the student knows the answer, [math]\displaystyle{ E_2= }[/math] the student guesses the answer, and [math]\displaystyle{ A= }[/math] the student answers correctly. Then [math]\displaystyle{ P(E_1)=p, P(E_2)=1-p, P(A|E_1)=1 }[/math] and [math]\displaystyle{ P(A|E_2)=\frac{1}{m} }[/math]. By Bayes' theorem, the probability that the student really knows the answer given a correct answer is [math]\displaystyle{ P(E_1|A)=\frac{P(E_1)P(A|E_1)}{P(E_1)P(A|E_1)+P(E_2)P(A|E_2)}=\frac{p \cdot 1}{p \cdot 1+(1-p) \cdot \frac{1}{m}}=\frac{mp}{1+(m-1)p} }[/math] (a numerical sketch appears after this list).
  • Counter-Example(s):
  • See: Belief Revision; Probability Theory; Bayesian Network; Naive-Bayes Model; Naive-Bayes Classifier; Bayesian Probability; Bayesianist; Bayesian Methods; Bayesian Model Selection; Bayes Networks; Statistical Proof.
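
The multiple-choice example above can be checked numerically. The following is a minimal sketch in Python, assuming illustrative values [math]\displaystyle{ p=0.6 }[/math] and [math]\displaystyle{ m=4 }[/math]; the function name posterior is ours and not taken from any cited source.

```python
# Bayes' theorem over mutually disjoint events E_1..E_n:
# P(E_i | A) = P(E_i) P(A | E_i) / sum_j P(E_j) P(A | E_j)

def posterior(priors, likelihoods):
    """Return P(E_i | A) for each i, given P(E_i) and P(A | E_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    evidence = sum(joint)            # P(A), by the law of total probability
    return [j / evidence for j in joint]

# Multiple-choice example with assumed values: P(knows) = 0.6, m = 4 choices.
p, m = 0.6, 4
priors = [p, 1 - p]                  # E_1 = knows, E_2 = guesses
likelihoods = [1.0, 1.0 / m]         # P(correct | E_i)

print(posterior(priors, likelihoods)[0])   # P(knows | correct) ≈ 0.857
print(m * p / (1 + (m - 1) * p))           # closed form mp / (1 + (m-1)p), same value
```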


References

2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Bayes'_rule Retrieved:2015-9-15.
    • In probability theory and applications, Bayes' rule relates the odds of event [math]\displaystyle{ A_1 }[/math] to the odds of event [math]\displaystyle{ A_2 }[/math], before (prior to) and after (posterior to) conditioning on another event [math]\displaystyle{ B }[/math]. The odds on [math]\displaystyle{ A_1 }[/math] against [math]\displaystyle{ A_2 }[/math] is simply the ratio of the probabilities of the two events. The prior odds is the ratio of the unconditional or prior probabilities; the posterior odds is the ratio of the conditional or posterior probabilities given the event [math]\displaystyle{ B }[/math]. The relationship is expressed in terms of the likelihood ratio or Bayes factor, [math]\displaystyle{ \Lambda }[/math]. By definition, this is the ratio of the conditional probabilities of the event [math]\displaystyle{ B }[/math] given that [math]\displaystyle{ A_1 }[/math] is the case or that [math]\displaystyle{ A_2 }[/math] is the case, respectively. The rule simply states: posterior odds equals prior odds times Bayes factor (Gelman et al., 2005, Chapter 1).

      When arbitrarily many events [math]\displaystyle{ A }[/math] are of interest, not just two, the rule can be rephrased as posterior is proportional to prior times likelihood, [math]\displaystyle{ P(A|B)\propto P(A) P(B|A) }[/math] where the proportionality symbol means that the left hand side is proportional to (i.e., equals a constant times) the right hand side as [math]\displaystyle{ A }[/math] varies, for fixed or given [math]\displaystyle{ B }[/math] (Lee, 2012; Bertsch McGrayne, 2012). In this form it goes back to Laplace (1774) and to Cournot (1843); see Fienberg (2005).

      Bayes' rule is an equivalent way to formulate Bayes' theorem. If we know the odds for and against [math]\displaystyle{ A }[/math], we also know the probabilities of [math]\displaystyle{ A }[/math]. It may be preferred to Bayes' theorem in practice for a number of reasons.

      Bayes' rule is widely used in statistics, science and engineering, for instance in model selection, probabilistic expert systems based on Bayes networks, statistical proof in legal proceedings, email spam filters, and so on (Rosenthal, 2005; Bertsch McGrayne, 2012). As an elementary fact from the calculus of probability, Bayes' rule tells us how unconditional and conditional probabilities are related whether we work with a frequentist interpretation of probability or a Bayesian interpretation of probability. Under the Bayesian interpretation it is frequently applied in the situation where [math]\displaystyle{ A_1 }[/math] and [math]\displaystyle{ A_2 }[/math] are competing hypotheses, and [math]\displaystyle{ B }[/math] is some observed evidence. The rule shows how one's judgement on whether [math]\displaystyle{ A_1 }[/math] or [math]\displaystyle{ A_2 }[/math] is true should be updated on observing the evidence [math]\displaystyle{ B }[/math] (Gelman et al., 2003).
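
The odds form described in this excerpt (posterior odds equals prior odds times the Bayes factor) can be sketched numerically. The probabilities below are assumed for illustration only and are not taken from the cited sources.

```python
# Odds form of Bayes' rule: posterior odds = prior odds × Bayes factor (likelihood ratio).
# All probabilities below are assumed for illustration.
p_A1, p_A2 = 0.3, 0.7            # prior probabilities of the competing hypotheses A1, A2
p_B_given_A1 = 0.8               # P(B | A1)
p_B_given_A2 = 0.2               # P(B | A2)

prior_odds = p_A1 / p_A2
bayes_factor = p_B_given_A1 / p_B_given_A2        # the likelihood ratio Λ
posterior_odds = prior_odds * bayes_factor

# Cross-check against the probability form of Bayes' theorem.
p_B = p_B_given_A1 * p_A1 + p_B_given_A2 * p_A2   # P(B), by total probability
ratio = (p_B_given_A1 * p_A1 / p_B) / (p_B_given_A2 * p_A2 / p_B)
print(posterior_odds, ratio)                      # the two values agree
```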


  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Bayes'_theorem Retrieved:2015-9-15.
    • In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule) describes the probability of an event, based on conditions that might be related to the event. For example, suppose one is interested in whether Addison has cancer. Furthermore, suppose that Addison is age 65. If cancer is related to age, information about Addison's age can be used to more accurately assess his or her chance of having cancer using Bayes' Theorem.

      When applied, the probabilities involved in Bayes' theorem may have different interpretations. In one of these interpretations, the theorem is used directly as part of a particular approach to statistical inference. In particular, with the Bayesian interpretation of probability, the theorem expresses how a subjective degree of belief should rationally change to account for evidence: this is Bayesian inference, which is fundamental to Bayesian statistics. However, Bayes' theorem has applications in a wide range of calculations involving probabilities, not just in Bayesian inference.

      Bayes' theorem is named after Rev. Thomas Bayes (1701–1761), who first showed how to use new evidence to update beliefs. It was further developed by Pierre-Simon Laplace, who first published the modern formulation in his 1812 Théorie analytique des probabilités. Sir Harold Jeffreys put Bayes' algorithm and Laplace's formulation on an axiomatic basis. Jeffreys wrote that Bayes' theorem "is to the theory of probability what Pythagoras's theorem is to geometry".

2009

  • (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Bayes'_theorem
    • "In some interpretations of probability, Bayes' theorem tells how to update or revise beliefs in light of new evidence a posteriori."
    • Each term in Bayes' theorem has a conventional name:
      • P(A) is the prior probability or marginal probability of A. It is "prior" in the sense that it does not take into account any information about B.
      • P(A|B) is the conditional probability of A, given B. It is also called the posterior probability because it is derived from or depends upon the specified value of B.
      • P(B|A) is the conditional probability of B given A.
      • P(B) is the prior or marginal probability of B, and acts as a normalizing constant.
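
As an illustration of these four terms, the short sketch below labels each quantity; the numbers are assumed for demonstration and do not come from the cited article.

```python
# Naming each term of Bayes' theorem P(A|B) = P(B|A) P(A) / P(B), with assumed numbers.
prior = 0.01                       # P(A): prior (marginal) probability of A
likelihood = 0.90                  # P(B|A): conditional probability of B given A
p_B_given_not_A = 0.05             # assumed, needed to compute the normalizing constant
marginal = likelihood * prior + p_B_given_not_A * (1 - prior)   # P(B), normalizing constant
posterior = likelihood * prior / marginal                       # P(A|B)
print(posterior)                   # ≈ 0.154
```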

2002

  • (Gabor Melli, 2002) ⇒ Gabor Melli. (2002). “PredictionWorks' Data Mining Glossary." PredictionWorks.
    • Bayes Theorem: Describes a useful relationship between the likelihood of a future event (posteriors) and the likelihood of a prior event (priors). Given a hypothesis [math]\displaystyle{ h }[/math] and a dataset [math]\displaystyle{ D }[/math], the likelihood that the hypothesis is correct for the dataset, P(h|D), can be expressed as P(D|h)P(h)/P(D). The use of P(h), "the prior", is the source of some debate among statisticians. The theorem can be proved by application of the product rule P(h∧D)=P(h|D)P(D)=P(D|h)P(h). See: Naive-Bayes Classifier.
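
A small numerical check of the product-rule derivation mentioned in this entry, using assumed values, showing that P(h∧D)=P(h|D)P(D)=P(D|h)P(h) yields the stated form P(h|D)=P(D|h)P(h)/P(D):

```python
# Assumed values for a hypothesis h and a dataset event D.
p_h_and_D = 0.12                   # P(h ∧ D)
p_D = 0.30                         # P(D)
p_h = 0.40                         # P(h)

p_h_given_D = p_h_and_D / p_D      # from the product rule: P(h ∧ D) = P(h|D) P(D)
p_D_given_h = p_h_and_D / p_h      # from the product rule: P(h ∧ D) = P(D|h) P(h)

# Bayes' theorem as stated in the glossary entry: P(h|D) = P(D|h) P(h) / P(D)
print(p_h_given_D, p_D_given_h * p_h / p_D)   # both equal 0.4
```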