Random Variable Function

A Random Variable Function ([math]\displaystyle{ X }[/math]) is a measurable real-valued function (a random element) that represents an abstract random experiment by assigning a real number to each outcome ([math]\displaystyle{ \omega }[/math]) in the sample space ([math]\displaystyle{ \Omega }[/math]).
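
The definition above can be pictured concretely on a finite sample space. Below is a minimal Python sketch (the two-coin experiment, the outcome tuples, and the function name num_heads are illustrative assumptions, not drawn from any quoted source) of a random variable as an ordinary function that assigns a real number to every outcome:

    # Sample space Omega for two fair coin flips; each outcome omega is a pair of 'H'/'T'.
    Omega = [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]

    # A random variable X: Omega -> R, here "the number of heads in the outcome".
    def num_heads(omega):
        return sum(1 for flip in omega if flip == 'H')

    # Every outcome is assigned exactly one real number.
    for omega in Omega:
        print(omega, '->', num_heads(omega))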



References

2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/Random_element#Random_variable Retrieved:2015-5-16.
    • A random variable is the simplest type of random element. It is a measurable function [math]\displaystyle{ X\colon \Omega \to \mathbb{R} }[/math] from the set of possible outcomes [math]\displaystyle{ \Omega }[/math] to [math]\displaystyle{ \mathbb{R} }[/math].

      As a real-valued function, [math]\displaystyle{ X }[/math] often describes some numerical quantity of a given event. E.g. the number of heads after a certain number of coin flips; the heights of different people.

      When the image (or range) of [math]\displaystyle{ X }[/math] is finite or countably infinite, the random variable is called a discrete random variable[1] and its distribution can be described by a probability mass function which assigns a probability to each value in the image of [math]\displaystyle{ X }[/math]. If the image is uncountably infinite then [math]\displaystyle{ X }[/math] is called a continuous random variable. In the special case that it is absolutely continuous, its distribution can be described by a probability density function, which assigns probabilities to intervals; in particular, each individual point must necessarily have probability zero for an absolutely continuous random variable. Not all continuous random variables are absolutely continuous,[2] for example a mixture distribution. Such random variables cannot be described by a probability density or a probability mass function.

  1. Yates, Daniel S.; Moore, David S; Starnes, Daren S. (2003). The Practice of Statistics (2nd ed.). New York: Freeman. ISBN 978-0-7167-4773-4. http://bcs.whfreeman.com/yates2e/. 
  2. L. Castañeda, V. Arunachalam, and S. Dharmaraja (2012). Introduction to Probability and Stochastic Processes with Applications. Wiley. p. 67. http://books.google.com/books?id=zxXRn-Qmtk8C&pg=PA67. 
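
As an illustration of the probability mass function described in the quoted passage, here is a sketch that assigns a probability to each value in the image of the random variable; the two-coin sample space and the equal outcome probabilities are assumptions carried over from the illustrative example under the definition above:

    from collections import defaultdict

    # Equally likely outcomes of two fair coin flips, each with probability 0.25.
    Omega = [('H', 'H'), ('H', 'T'), ('T', 'H'), ('T', 'T')]
    P_outcome = {omega: 0.25 for omega in Omega}

    def num_heads(omega):          # the random variable X = number of heads
        return sum(1 for flip in omega if flip == 'H')

    # The pmf gives each value x in the image of X the total probability
    # of the outcomes that X maps to x.
    pmf = defaultdict(float)
    for omega in Omega:
        pmf[num_heads(omega)] += P_outcome[omega]

    print(dict(pmf))               # {2: 0.25, 1: 0.5, 0: 0.25}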

2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/random_variable Retrieved:2015-5-16.
    • In probability and statistics, a random variable, aleatory variable or stochastic variable is a variable whose value is subject to variation due to chance (i.e. randomness, in a mathematical sense). Like other mathematical variables, a random variable can take on a set of different possible values; unlike other mathematical variables, each of those values has an associated probability.

      A random variable's possible values might represent the possible outcomes of a yet-to-be-performed experiment, or the possible outcomes of a past experiment whose already-existing value is uncertain (for example, due to imprecise measurements or quantum uncertainty). They may also conceptually represent either the results of an "objectively" random process (such as rolling a die) or the "subjective" randomness that results from incomplete knowledge of a quantity. The meaning of the probabilities assigned to the potential values of a random variable is not part of probability theory itself but is instead related to philosophical arguments over the interpretation of probability. The mathematics works the same regardless of the particular interpretation in use.

      The mathematical function describing the possible values of a random variable and their associated probabilities is known as a probability distribution. Random variables can be discrete, that is, taking any of a specified finite or countable list of values, endowed with a probability mass function, characteristic of a probability distribution; or continuous, taking any numerical value in an interval or collection of intervals, via a probability density function that is characteristic of a probability distribution; or a mixture of both types. The realizations of a random variable, that is, the results of randomly choosing values according to the variable's probability distribution function, are called random variates.

      The formal mathematical treatment of random variables is a topic in probability theory. In that context, a random variable is understood as a function defined on a sample space whose outputs are numerical values.
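
The term "random variates" above, realizations drawn according to the variable's distribution, can be sketched as follows; the particular pmf, sample size, and seed are assumptions for illustration only:

    import random

    random.seed(0)                 # fixed seed so the illustration is reproducible

    # A discrete distribution for X: its values together with their pmf weights.
    values  = [0, 1, 2]
    weights = [0.25, 0.5, 0.25]

    # Random variates: realizations of X drawn according to this distribution.
    variates = random.choices(values, weights=weights, k=10)
    print(variates)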


2015

  • (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/random_variable#Definition Retrieved:2015-5-16.
    • A random variable [math]\displaystyle{ X\colon \Omega \to E }[/math] is a measurable function from the set of possible outcomes [math]\displaystyle{ \Omega }[/math] to some set [math]\displaystyle{ E }[/math]. Usually, [math]\displaystyle{ E = \mathbb{R} }[/math]; otherwise the term random element is used instead (see Extensions). The technical axiomatic definition requires both [math]\displaystyle{ \Omega }[/math] and [math]\displaystyle{ E }[/math] to be measurable spaces (see Measure-theoretic definition).

      As a real-valued function, [math]\displaystyle{ X }[/math] often describes some numerical quantity of a given event. E.g. the number of heads after a certain number of coin flips; the heights of different people.

      When the image (or range) of [math]\displaystyle{ X }[/math] is finite or countably infinite, the random variable is called a discrete random variable and its distribution can be described by a probability mass function which assigns a probability to each value in the image of [math]\displaystyle{ X }[/math]. If the image is uncountably infinite then [math]\displaystyle{ X }[/math] is called a continuous random variable. In the special case that it is absolutely continuous, its distribution can be described by a probability density function, which assigns probabilities to intervals; in particular, each individual point must necessarily have probability zero for an absolutely continuous random variable. Not all continuous random variables are absolutely continuous, for example a mixture distribution. Such random variables cannot be described by a probability density or a probability mass function.

      All random variables can be described by their cumulative distribution function, which describes the probability that the random variable will be less than or equal to a certain value.
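
A short sketch of the cumulative distribution function F(x) = P(X ≤ x) for a discrete random variable; the pmf values are the two-coin illustration used earlier, an assumption rather than part of the quoted text:

    # pmf of X = number of heads in two fair coin flips (illustrative values).
    pmf = {0: 0.25, 1: 0.5, 2: 0.25}

    def cdf(x):
        """F(x) = P(X <= x): accumulate the pmf over all values not exceeding x."""
        return sum(p for value, p in pmf.items() if value <= x)

    for x in [-1, 0, 0.5, 1, 2, 3]:
        print(x, cdf(x))           # expected: 0, 0.25, 0.25, 0.75, 1.0, 1.0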

2006

  • (Dubnicka, 2006c) ⇒ Suzanne R. Dubnicka. (2006). “Random Variables - STAT 510: Handout 3." Kansas State University, Introduction to Probability and Statistics I, STAT 510 - Fall 2006.
    • MATHEMATICAL DEFINITION: A random variable [math]\displaystyle{ X }[/math] is a function whose domain is the sample space [math]\displaystyle{ S }[/math] and whose range is the set of real numbers [math]\displaystyle{ \mathbb{R} = \{x : -\infty \lt x \lt \infty\} }[/math]. Thus, a random variable is obtained by assigning a numerical value to each outcome of a particular experiment.
    • WORKING DEFINITION: A random variable is a variable whose observed value is determined by chance.
    • NOTATION: We denote a random variable [math]\displaystyle{ X }[/math] with a capital letter; we denote an observed value of [math]\displaystyle{ X }[/math] as [math]\displaystyle{ x }[/math], a lowercase letter.
    • TERMINOLOGY: The support of a random variable [math]\displaystyle{ X }[/math] is the set of all possible values that [math]\displaystyle{ X }[/math] can assume. We will often denote the support set as [math]\displaystyle{ S_X }[/math]. If the random variable [math]\displaystyle{ X }[/math] has a support set [math]\displaystyle{ S_X }[/math] that is either finite or countable, we call [math]\displaystyle{ X }[/math] a discrete random variable.
    • The pmf of a discrete random variable and the pdf of a continuous random variable provide complete information about the probabilistic properties of a random variable. However, it is sometimes useful to employ summary measures. The most basic summary measure is the expectation or mean of a random variable [math]\displaystyle{ X }[/math], denoted [math]\displaystyle{ E(X) }[/math], which can be thought of as an “average” value of a random variable.
    • TERMINOLOGY: Let [math]\displaystyle{ X }[/math] be a discrete random variable with pmf [math]\displaystyle{ p_X(x) }[/math] and support [math]\displaystyle{ S_X }[/math]. The expected value of [math]\displaystyle{ X }[/math] is given by [math]\displaystyle{ E(X) = \sum_{x \in S_X} x\, p_X(x) }[/math].
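
A worked sketch of this expectation formula, using the two-coin pmf from the earlier illustrations as an assumed example:

    # Support S_X and pmf p_X for X = number of heads in two fair coin flips.
    support = [0, 1, 2]
    p_X = {0: 0.25, 1: 0.5, 2: 0.25}

    # E(X) = sum over x in S_X of x * p_X(x)
    expected_value = sum(x * p_X[x] for x in support)
    print(expected_value)          # 0*0.25 + 1*0.5 + 2*0.25 = 1.0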

1986

  • (Larsen & Marx, 1986) ⇒ Richard J. Larsen, and Morris L. Marx. (1986). “An Introduction to Mathematical Statistics and Its Applications, 2nd edition." Prentice Hall
    • QUOTE: ... The revised sample space contains 11 outcomes, but the latter are not equally likely. … In general, rules for redefining sample spaces - like going from (x, y)'s to (x + y)'s - are called random variables. As a conceptual framework, random variables are of fundamental importance: they provide a single rubric under which all probability problems may be brought. Even in cases where the original sample space needs no redefinition - that is, where the measurement recorded is the measurement of interest - the concept still applies: we simply take the random variable to be the identity mapping.
    • Definition 3.2.1. A real-valued function whose domain is the sample space [math]\displaystyle{ S }[/math] is called a random variable. We denote random variables by uppercase letters, often [math]\displaystyle{ X }[/math], [math]\displaystyle{ Y }[/math], or [math]\displaystyle{ Z }[/math].
    • If the range of the mapping contains either a finite or countably infinite number of values, the random variable is said to be discrete; if the range includes an interval of real numbers, bounded or unbounded, the random variable is said to be continuous.
    • Associated with each discrete random variable [math]\displaystyle{ Y }[/math] is a probability density function (or pdf), [math]\displaystyle{ f_Y(y) }[/math]. By definition, [math]\displaystyle{ f_Y(y) }[/math] is the sum of all the probabilities associated with outcomes in [math]\displaystyle{ S }[/math] that get mapped into [math]\displaystyle{ y }[/math] by the random variable [math]\displaystyle{ Y }[/math]. That is,
      • [math]\displaystyle{ f_Y(y) = P(\{s \in S : Y(s) = y\}) }[/math]
    • Conceptually, [math]\displaystyle{ f_Y(y) }[/math] describes the probability structure induced on the real line by the random variable [math]\displaystyle{ Y }[/math].
    • For notational simplicity, we will delete all references to [math]\displaystyle{ s }[/math] and [math]\displaystyle{ S }[/math] and write [math]\displaystyle{ f_Y(y) = P(Y = y) }[/math]. In other words, [math]\displaystyle{ f_Y(y) }[/math] is the "probability that the random variable [math]\displaystyle{ Y }[/math] takes on the value [math]\displaystyle{ y }[/math]."
    • Associated with each continuous random variable [math]\displaystyle{ Y }[/math] is also a probability density function, [math]\displaystyle{ f_Y(y) }[/math], but [math]\displaystyle{ f_Y(y) }[/math] in this case is not the probability that the random variable [math]\displaystyle{ Y }[/math] takes on the value [math]\displaystyle{ y }[/math]. Rather, [math]\displaystyle{ f_Y(y) }[/math] is a continuous curve having the property that for all [math]\displaystyle{ a }[/math] and [math]\displaystyle{ b }[/math],
      • [math]\displaystyle{ P(a \le Y \le b) = P(\{s \in S : a \le Y(s) \le b\}) = \int_a^b f_Y(y)\, dy }[/math]
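
A sketch of this continuous case, approximating P(a ≤ Y ≤ b) as a midpoint Riemann sum of the density over [a, b]; the density f_Y(y) = 2y on [0, 1] is an assumed example, not one taken from Larsen & Marx:

    # An assumed density on [0, 1]: f_Y(y) = 2y, which integrates to 1 over [0, 1].
    def f_Y(y):
        return 2.0 * y if 0.0 <= y <= 1.0 else 0.0

    def prob_between(a, b, n=100000):
        """Approximate P(a <= Y <= b) as a midpoint Riemann sum of f_Y over [a, b]."""
        width = (b - a) / n
        return sum(f_Y(a + (i + 0.5) * width) for i in range(n)) * width

    print(prob_between(0.25, 0.75))    # approximately 0.5; exact value is 0.75**2 - 0.25**2 = 0.5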
