# 1987 EngineeringStatistics

- (Hogg & Ledolter, 1987) ⇒ Robert V. Hogg, Johannes Ledolter. (1987). “Engineering Statistics.” Macmillan Publishing. ISBN:0023557907

**Subject Headings:** Statistics Textbook.

## Notes

## Quotes

{{#ifanon:|

### 2 Probability Models

### 2.1 Probability

In applied mathematics we are usually concerned with either *deterministic* or *probabilistic* models, although in many instances these are intertwined.

... a *deterministic model* because everything is known once … conditions are specified.

…

*Random experiments* have *outcomes* that cannot be determined with certainty before the experiments are performed...

- The collection of all possible outcomes, namely [math]\displaystyle{ S }[/math] = {H, T}, is called the *sample space*. Suppose that we are interested in a subset [math]\displaystyle{ A }[/math] of our sample space; for example, in our case, let *A* = {H} represent heads. Repeat this random experiment a number of times, say [math]\displaystyle{ n }[/math], and count the number of times, say [math]\displaystyle{ f }[/math], that the experiment ended in *A*. Here [math]\displaystyle{ f }[/math] is called the *frequency* of the *event A* and the ratio *f/n* is called the *relative frequency* of the event [math]\displaystyle{ A }[/math] in the [math]\displaystyle{ n }[/math] trials of the experiment.
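The relative-frequency idea above can be sketched in a short simulation. The fair-coin setup mirrors the *S* = {H, T} example in the text; the use of Python's `random` module and a fixed seed are illustrative choices, not part of the book.

```python
import random

# Simulate n flips of a fair coin and compute the relative frequency
# f/n of the event A = {H}. A fixed seed makes the run reproducible.
def relative_frequency_of_heads(n, seed=42):
    rng = random.Random(seed)
    f = sum(1 for _ in range(n) if rng.random() < 0.5)  # frequency of A
    return f / n  # relative frequency of A in n trials

# As n grows, the relative frequency f/n settles near P(A) = 0.5.
for n in (10, 100, 10_000):
    print(n, relative_frequency_of_heads(n))
```

Running this shows the stabilization of relative frequency that motivates the frequency interpretation of probability.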

### 2.4 Multivariate Distributions

... We start our discussion by considering the probabilities that are associated with two random variables, [math]\displaystyle{ X }[/math] and *Y*. We call the probability function

- [math]\displaystyle{ f(x, y) = P(X = x, Y = y), \quad (x, y) \in R, }[/math]

where [math]\displaystyle{ R }[/math] is the space of (*X*, *Y*), the [[Joint Probability Density Function|joint probability density function]] of [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math], or simply the joint density of [math]\displaystyle{ X }[/math] and *Y*.
…

Probabilities such as … are called *marginal probabilities* because they are usually recorded in the margins of a joint probability table.
…

In Example 2.4-1 we have illustrated the computation of the marginal probabilities

- [math]\displaystyle{ f_1(x) = \sum_y f(x, y) = \sum_y P(X = x, Y = y) = P(X = x). }[/math]

and

- [math]\displaystyle{ f_2(y) = \sum_x f(x, y) = \sum_x P(X = x, Y = y) = P(Y = y). }[/math]

We call [math]\displaystyle{ f_1(x) }[/math] and [math]\displaystyle{ f_2(y) }[/math] the [[Marginal Probability Density Function|marginal probability density functions]] of [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math], respectively. If

- [math]\displaystyle{ f(x, y) = P(X = x, Y = y) = P(X = x)P(Y = y) = f_1(x) f_2(y), }[/math]

for all [math]\displaystyle{ x }[/math] and [math]\displaystyle{ y }[/math], then [math]\displaystyle{ X }[/math] and [math]\displaystyle{ Y }[/math] are said to be independent random variables. That is, the joint p.d.f. of independent random variables is the product of the marginal densities. In Section 2.3 we considered several *mutually independent random variables*, say *X*_{1}, *X*_{2}, ..., *X*_{n}, each with the same marginal probability density function [math]\displaystyle{ f(x) }[/math]. Recall that in these situations *X*_{1}, *X*_{2}, ..., *X*_{n} are called observations of a random sample, and their [[Joint Probability Density Function|joint probability density function]] is the product of the marginals, namely,

- [math]\displaystyle{ f(x_1) f(x_2) \cdots f(x_n). }[/math]

If random variables are not independent, they are said to be *dependent*.
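The marginal sums and the product test for independence can be sketched on a small joint probability table. The particular probabilities below are made up for illustration; the table is constructed so that *f*(*x*, *y*) = *f*_{1}(*x*)*f*_{2}(*y*) holds by design.

```python
from itertools import product

# A joint p.m.f. f(x, y) stored as a dict, built as a product of two
# marginals so that X and Y are independent by construction.
# (These numbers are illustrative, not from the text.)
joint = {(x, y): px * py
         for (x, px), (y, py) in product([(0, 0.3), (1, 0.7)],
                                         [(0, 0.4), (1, 0.6)])}

xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

# Marginals: f1(x) = sum over y of f(x, y); f2(y) = sum over x of f(x, y).
f1 = {x: sum(joint[(x, y)] for y in ys) for x in xs}
f2 = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Independence check: f(x, y) must equal f1(x) * f2(y) for ALL (x, y).
independent = all(abs(joint[(x, y)] - f1[x] * f2[y]) < 1e-12
                  for x in xs for y in ys)
print(f1, f2, independent)
```

Changing any single cell of `joint` (and renormalizing) generally breaks the product relation, which is how dependence would show up in this check.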

}}

| Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
|---|---|---|---|---|---|---|---|---|---|
| Robert V. Hogg, Johannes Ledolter | | 1987 | Engineering Statistics | | | http://books.google.com/books?id=CBWRQgAACAAJ | | | 1987 |