1987 EngineeringStatistics

From GM-RKB

Subject Headings: Statistics Textbook.

Notes

Quotes


2 Probability Models

2.1 Probability

In applied mathematics we are usually concerned with either deterministic or probabilistic models, although in many instances these are intertwined.

... a deterministic model because everything is known once … conditions are specified.

Random experiments have outcomes that cannot be determined with certainty before the experiments are performed...

  • The collection of all possible outcomes, namely [math]\displaystyle{ S }[/math] = {H,T}, is called the sample space. Suppose that we are interested in a subset [math]\displaystyle{ A }[/math] of our sample space; for example, in our case, let A={H} represent heads. Repeat this random experiment a number of times, say [math]\displaystyle{ n }[/math], and count the number of times, say [math]\displaystyle{ f }[/math], that the experiment ended in A. Here [math]\displaystyle{ f }[/math] is called the frequency of the event A and the ratio f/n is called the relative frequency of the event [math]\displaystyle{ A }[/math] in the [math]\displaystyle{ n }[/math] trials of the experiment.
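The frequency interpretation is easy to see in a short simulation. The sketch below is not from the book; the function name, the seed, and the toss counts are illustrative choices for a fair coin with [math]\displaystyle{ P(A) = 1/2 }[/math]:

<syntaxhighlight lang="python">
import random

def relative_frequency(n, seed=0):
    """Toss a fair coin n times and return f/n, the relative frequency of A = {H}."""
    rng = random.Random(seed)
    f = sum(1 for _ in range(n) if rng.random() < 0.5)  # f = frequency of the event A
    return f / n

# As n grows, f/n settles near P(A) = 1/2 (the frequency interpretation).
for n in (10, 100, 10_000):
    print(n, relative_frequency(n))
</syntaxhighlight>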

2.4 Multivariate Distributions

... We start our discussion by considering the probabilities that are associated with two random variables, [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math]. We call the probability function

[math]\displaystyle{ f(x, y) = P(X=x,\, Y=y), \quad (x,y) \in R, }[/math]

where [math]\displaystyle{ R }[/math] is the space of [math]\displaystyle{ (X, Y) }[/math], the [[Joint Probability Density Function|joint probability density function]] of [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math], or simply the joint density of [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math]. …

Probabilities such as … are called marginal probabilities because they are usually recorded in the margins of a joint probability table. …

In Example 2.4-1 we have illustrated the computation of the marginal probabilities

[math]\displaystyle{ f_1(x) = \sum_y f(x,y) = \sum_y P(X=x,\, Y=y) = P(X=x) }[/math]

and

[math]\displaystyle{ f_2(y) = \sum_x f(x,y) = \sum_x P(X=x,\, Y=y) = P(Y=y). }[/math]
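These two summations are mechanical enough to carry out in code: each marginal is a row or column sum of the joint probability table. The sketch below is not from the book; the joint table is a hypothetical p.m.f. with made-up probabilities, used only to show the two sums:

<syntaxhighlight lang="python">
from collections import defaultdict

# Hypothetical joint p.m.f. f(x, y) = P(X = x, Y = y) on a finite space R,
# stored as a dict keyed by (x, y); the probabilities are made up for illustration.
f = {
    (0, 0): 0.125, (0, 1): 0.375,
    (1, 0): 0.125, (1, 1): 0.375,
}

# f1(x) = sum over y of f(x, y);  f2(y) = sum over x of f(x, y).
f1 = defaultdict(float)
f2 = defaultdict(float)
for (x, y), p in f.items():
    f1[x] += p
    f2[y] += p

print(dict(f1))  # {0: 0.5, 1: 0.5}
print(dict(f2))  # {0: 0.25, 1: 0.75}
</syntaxhighlight>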

We call [math]\displaystyle{ f_1(x) }[/math] and [math]\displaystyle{ f_2(y) }[/math] the [[Marginal Probability Density Function|marginal probability density functions]] of [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math], respectively. If

[math]\displaystyle{ f(x,y) = P(X=x,\, Y=y) = P(X=x)\,P(Y=y) = f_1(x)\,f_2(y), }[/math]

for all [math]\displaystyle{ x }[/math] and [math]\displaystyle{ y }[/math], then [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] are said to be independent random variables. That is, the joint p.d.f. of independent random variables is the product of the marginal densities. In Section 2.3 we consider several mutually independent random variables, say [math]\displaystyle{ X_1, X_2, \ldots, X_n }[/math], each with the same marginal probability density function [math]\displaystyle{ f(x) }[/math]. Recall that in these situations [math]\displaystyle{ X_1, X_2, \ldots, X_n }[/math] are called observations of a random sample, and their [[Joint Probability Density Function|joint probability density function]] is the product of the marginals, namely,

[math]\displaystyle{ f(x_1)\,f(x_2) \cdots f(x_n). }[/math]

If random variables are not independent, they are said to be dependent.
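The factorization criterion can be checked mechanically: compute both marginals from the joint table and compare their product with each joint entry. The helper below is an illustrative sketch, not from the book; the function name <code>is_independent</code> and the two toy tables are assumptions:

<syntaxhighlight lang="python">
import math

def is_independent(f, tol=1e-9):
    """Return True if f(x, y) = f1(x) * f2(y) for every (x, y), i.e. X and Y are independent."""
    xs = {x for x, _ in f}
    ys = {y for _, y in f}
    f1 = {x: sum(f.get((x, y), 0.0) for y in ys) for x in xs}  # marginal of X
    f2 = {y: sum(f.get((x, y), 0.0) for x in xs) for y in ys}  # marginal of Y
    return all(math.isclose(f.get((x, y), 0.0), f1[x] * f2[y], abs_tol=tol)
               for x in xs for y in ys)

# A joint p.m.f. that factors into its marginals (independent)...
print(is_independent({(0, 0): 0.12, (0, 1): 0.18, (1, 0): 0.28, (1, 1): 0.42}))  # True
# ...and one that does not (dependent): all mass on the diagonal.
print(is_independent({(0, 0): 0.5, (1, 1): 0.5}))  # False
</syntaxhighlight>

The tolerance absorbs floating-point rounding in the products; the second table fails because, for example, [math]\displaystyle{ f(0,0) = 0.5 }[/math] while [math]\displaystyle{ f_1(0)f_2(0) = 0.25 }[/math].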



Robert V. Hogg, and Johannes Ledolter (1987). "Engineering Statistics." http://books.google.com/books?id=CBWRQgAACAAJ