Probability Measure

A probability measure is a real-valued measure on events which assigns a nonnegative probability value to every set in a sigma-field (a collection of subsets of a sample space). Formally, a probability measure [math]\displaystyle{ P }[/math] on a sigma-field [math]\displaystyle{ \mathcal{F} }[/math] of subsets of a sample space [math]\displaystyle{ \Omega }[/math] satisfies the following three axioms (a small worked example follows the list):

(a) [math]\displaystyle{ 0 \leq P(A) \leq 1 }[/math] for all subsets [math]\displaystyle{ A \; \in \; \mathcal{F} }[/math]
(b) [math]\displaystyle{ P(\emptyset)=0 }[/math] and [math]\displaystyle{ P(\Omega)=1 }[/math]
(c) If [math]\displaystyle{ \{A_1,A_2,A_3,\cdots \} }[/math] is a sequence of disjoint sets (i.e. [math]\displaystyle{ A_i \cap A_j = \emptyset }[/math] whenever [math]\displaystyle{ i\neq j }[/math]) that belong to [math]\displaystyle{ \mathcal{F} }[/math], then [math]\displaystyle{ P(\cup_iA_i)= \sum^{\infty}_{i=1} P(A_i) }[/math].
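
As an illustration of the three axioms above, here is a minimal Python sketch (the fair six-sided die and all names in it, such as SAMPLE_SPACE and prob, are assumptions of this illustration, not taken from the sources below). It represents the uniform probability measure on the power set of the die's outcomes and checks (a)-(c) directly; on a finite sample space, countable additivity reduces to finite additivity.

    from fractions import Fraction
    from itertools import combinations

    # Illustrative finite example: a fair six-sided die.
    # Sample space Omega = {1, ..., 6}; the sigma-field is the full power set,
    # and P(A) = |A| / |Omega|. Fractions keep the arithmetic exact.
    SAMPLE_SPACE = frozenset(range(1, 7))

    def prob(event: frozenset) -> Fraction:
        """P(A) = |A| / |Omega| for the uniform measure on a fair die."""
        assert event <= SAMPLE_SPACE, "events must be subsets of the sample space"
        return Fraction(len(event), len(SAMPLE_SPACE))

    # Axiom (a): 0 <= P(A) <= 1 for every A in the sigma-field (the power set).
    events = [frozenset(a) for r in range(len(SAMPLE_SPACE) + 1)
              for a in combinations(SAMPLE_SPACE, r)]
    assert all(0 <= prob(a) <= 1 for a in events)

    # Axiom (b): P(empty set) = 0 and P(Omega) = 1.
    assert prob(frozenset()) == 0 and prob(SAMPLE_SPACE) == 1

    # Axiom (c), finite case: additivity over disjoint events.
    a1, a2 = frozenset({1, 2}), frozenset({5})
    assert a1.isdisjoint(a2)
    assert prob(a1 | a2) == prob(a1) + prob(a2)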


References

2022

  • (Wikipedia, 2022) ⇒ https://en.wikipedia.org/wiki/Probability_measure Retrieved:2022-1-11.
    • In mathematics, a probability measure is a real-valued function defined on a set of events in a probability space that satisfies measure properties such as countable additivity. [1] The difference between a probability measure and the more general notion of measure (which includes concepts like area or volume) is that a probability measure must assign value 1 to the entire probability space.

      Intuitively, the additivity property says that the probability assigned to the union of two disjoint events by the measure should be the sum of the probabilities of the events, e.g. the value assigned to "1 or 2" in a throw of a die should be the sum of the values assigned to "1" and "2".

      Probability measures have applications in diverse fields, from physics to finance and biology.
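
      As a concrete instance of this additivity (assuming a fair die, so each face has probability [math]\displaystyle{ 1/6 }[/math]): [math]\displaystyle{ P(\{1, 2\}) = P(\{1\}) + P(\{2\}) = 1/6 + 1/6 = 1/3 }[/math].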

  1. George G. Roussas (2004). “An Introduction to Measure-Theoretic Probability.” ISBN 0-12-599022-7, page 47.

2008

A probability space is a triple [math]\displaystyle{ (\Omega, \mathcal{F}, \mathbb{P}) }[/math], in which:
(a) [math]\displaystyle{ \Omega }[/math] is the sample space, the set of possible outcomes of the experiment.
(b) [math]\displaystyle{ \mathcal{F} }[/math] is a σ-field, a collection of subsets of [math]\displaystyle{ \Omega }[/math].
(c) [math]\displaystyle{ \mathbb{P} }[/math] is a probability measure, a function that assigns a nonnegative probability to every set in the σ-field [math]\displaystyle{ \mathcal{F} }[/math].
(...) Let [math]\displaystyle{ (\Omega, \mathcal{F}) }[/math] be a measurable space. A measure is a function [math]\displaystyle{ \mu \; :\; \mathcal{F} \rightarrow [0, +\infty] }[/math], which assigns a nonnegative extended real number [math]\displaystyle{ \mu(A) }[/math] to every set [math]\displaystyle{ A }[/math] in [math]\displaystyle{ \mathcal{F} }[/math], and which satisfies the following two conditions:
(a) [math]\displaystyle{ \mu(\emptyset)=0 }[/math];
(b) (Countable additivity) If [math]\displaystyle{ \{A_i\} }[/math] is a sequence of disjoint sets that belong to [math]\displaystyle{ \mathcal{F} }[/math], then [math]\displaystyle{ \mu(\cup_iA_i) = \sum^{\infty}_{i=1} \mu(A_i) }[/math].
A probability measure is a measure [math]\displaystyle{ \mathbb{P} }[/math] with the additional property [math]\displaystyle{ \mathbb{P}(\Omega)= 1 }[/math]. In that case, the triple [math]\displaystyle{ (\Omega, \mathcal{F}, \mathbb{P}) }[/math] is called a probability space.
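
To make the difference between a general measure and a probability measure concrete, here is a minimal Python sketch (the four-outcome set and all names in it are assumptions of this illustration): the counting measure on a finite set satisfies conditions (a) and (b) above but assigns [math]\displaystyle{ \mu(\Omega) = 4 }[/math], while dividing by [math]\displaystyle{ \mu(\Omega) }[/math] turns it into a probability measure.

    from fractions import Fraction

    # Illustrative finite setting: Omega has four outcomes and the sigma-field
    # is its power set, so every subset is an event.
    OMEGA = frozenset({"a", "b", "c", "d"})

    def counting_measure(event: frozenset) -> int:
        """mu(A) = |A|: nonnegative, mu(empty set) = 0, and additive."""
        return len(event)

    def prob_measure(event: frozenset) -> Fraction:
        """Normalize the counting measure so that P(Omega) = 1."""
        return Fraction(counting_measure(event), counting_measure(OMEGA))

    # The counting measure satisfies conditions (a) and (b) above ...
    assert counting_measure(frozenset()) == 0
    d1, d2 = frozenset({"a"}), frozenset({"b", "c"})
    assert counting_measure(d1 | d2) == counting_measure(d1) + counting_measure(d2)

    # ... but it is not a probability measure, since mu(Omega) = 4, not 1.
    assert counting_measure(OMEGA) == 4

    # The normalized version satisfies P(Omega) = 1 and stays additive.
    assert prob_measure(OMEGA) == 1
    assert prob_measure(d1 | d2) == prob_measure(d1) + prob_measure(d2)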

1986

  • (Larsen & Marx, 1986) ⇒ Richard J. Larsen, and Morris L. Marx. (1986). “An Introduction to Mathematical Statistics and Its Applications, 2nd edition." Prentice Hall
    • QUOTE: Consider a sample space, [math]\displaystyle{ S }[/math], and any event, [math]\displaystyle{ A }[/math], defined on [math]\displaystyle{ S }[/math]. If our experiment were performed one time, either [math]\displaystyle{ A }[/math] or [math]\displaystyle{ A^C }[/math] would be the outcome. If it were performed [math]\displaystyle{ n }[/math] times, the resulting set of sample outcomes would be members of [math]\displaystyle{ A }[/math] on [math]\displaystyle{ m }[/math] occasions, [math]\displaystyle{ m }[/math] being some integer between [math]\displaystyle{ 1 }[/math] and [math]\displaystyle{ n }[/math], inclusive. Hypothetically, we could continue this process an infinite number of times. As [math]\displaystyle{ n }[/math] gets large, the ratio [math]\displaystyle{ m/n }[/math] will fluctuate less and less (we will make that statement more precise a little later). The number that [math]\displaystyle{ m/n }[/math] converges to is called the empirical probability of [math]\displaystyle{ A }[/math]: that is, [math]\displaystyle{ P(A) = \lim_{n \rightarrow \infty}(m/n) }[/math]. … the very act of repeating an experiment under identical conditions an infinite number of times is physically impossible. And left unanswered is the question of how large [math]\displaystyle{ n }[/math] must be to give a good approximation for [math]\displaystyle{ \lim_{n \rightarrow \infty}(m/n) }[/math].

      The next attempt at defining probability was entirely a product of the twentieth century. Modern mathematicians have shown a keen interest in developing subjects axiomatically. It was to be expected, then, that probability would come under such scrutiny … The major breakthrough on this front came in 1933 when Andrei Kolmogorov published Grundbegriffe der Wahrscheinlichkeitsrechnung (Foundations of the Theory of Probability). Kolmogorov's work was a masterpiece of mathematical elegance: it reduced the behavior of the probability function to a set of just three or four simple postulates, three if the sample space is limited to a finite number of outcomes and four if [math]\displaystyle{ S }[/math] is infinite.
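
The limiting relative-frequency idea in this quote is easy to simulate. The following minimal Python sketch (the fair-die event and every name in it are assumptions of this illustration, not part of Larsen & Marx's text) estimates [math]\displaystyle{ m/n }[/math] for the event "the roll is 1 or 2" at increasing [math]\displaystyle{ n }[/math]; the printed ratios fluctuate less and less around the theoretical value [math]\displaystyle{ 1/3 }[/math].

    import random

    random.seed(0)  # fixed seed for a reproducible illustration

    # Event A = "the die shows 1 or 2" for a simulated fair six-sided die;
    # its theoretical probability is 2/6 = 1/3.
    EVENT = {1, 2}

    def relative_frequency(n: int) -> float:
        """Perform the experiment n times and return m/n, the fraction of
        trials whose outcome fell in EVENT."""
        m = sum(random.randint(1, 6) in EVENT for _ in range(n))
        return m / n

    for n in (10, 100, 1_000, 10_000, 100_000):
        print(f"n = {n:>6}:  m/n = {relative_frequency(n):.4f}")
    # As n grows, m/n settles near 1/3, mirroring P(A) = lim (m/n).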

1933

  • (Kolmogorov, 1933) ⇒ Andrei Kolmogorov. (1933). “Grundbegriffe der Wahrscheinlichkeitsrechnung (Foundations of the Theory of Probability).” American Mathematical Society. ISBN:0828400237
    • QUOTE: … Every distribution function [math]\displaystyle{ F_{\mu_1 \mu_2 \ldots \mu_n} }[/math], satisfying the general conditions of Chap. II, Sec. 3, III and also conditions (2) and (3), defines uniquely a corresponding probability function [math]\displaystyle{ \text{P}_{\mu_1 \mu_2 \ldots \mu_n} }[/math] for all Borel sets of [math]\displaystyle{ R^n }[/math].
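
A one-dimensional instance of this correspondence can be sketched in a few lines of Python (the exponential distribution function [math]\displaystyle{ F(x) = 1 - e^{-x} }[/math] and all names below are assumptions of the illustration, not Kolmogorov's text): a distribution function determines the probability assigned to every half-open interval via [math]\displaystyle{ P((a, b]) = F(b) - F(a) }[/math], and from those intervals the measure extends uniquely to all Borel sets.

    import math

    def F(x: float) -> float:
        """An illustrative distribution function on R: the exponential law,
        F(x) = 1 - exp(-x) for x >= 0 and F(x) = 0 for x < 0."""
        return 1.0 - math.exp(-x) if x >= 0 else 0.0

    def prob_interval(a: float, b: float) -> float:
        """P((a, b]) = F(b) - F(a): the probability that the measure
        determined by F assigns to the half-open interval (a, b]."""
        assert a <= b
        return F(b) - F(a)

    # Disjoint adjacent intervals add up, as countable additivity requires:
    assert math.isclose(prob_interval(0.0, 1.0) + prob_interval(1.0, 2.0),
                        prob_interval(0.0, 2.0))
    # The whole line gets total probability 1:
    print(prob_interval(0.0, math.inf))  # -> 1.0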