# Conditional Independence Relation

A Conditional Independence Relation is a statistical relation between events [math]A[/math] and [math]B[/math] given an event [math]C[/math] that holds if and only if [math]\Pr(A \mid B \cap C) = \Pr(A \mid C)[/math], where [math]\Pr[/math] is a conditional probability function.

**AKA:** CI, [math]CI(A,B,C)[/math].

**Counter-Example(s):**

**See:** Conditional, Independence Relation, Statistical Independence Relation.
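The defining identity above can be checked on a small discrete example. The following sketch (all numeric values are illustrative assumptions, not from the source) builds a joint distribution in which [math]A[/math] and [math]B[/math] are conditionally independent given [math]C[/math] by construction, then verifies that [math]\Pr(A \mid B \cap C) = \Pr(A \mid C)[/math]:

```python
# Illustrative values (assumptions, not from the source).
p_c = {0: 0.4, 1: 0.6}            # Pr(C = c)
p_a_given_c = {0: 0.2, 1: 0.7}    # Pr(A = 1 | C = c)
p_b_given_c = {0: 0.5, 1: 0.1}    # Pr(B = 1 | C = c)

def joint(a, b, c):
    """Pr(A=a, B=b, C=c), built so that A and B are
    conditionally independent given C."""
    pa = p_a_given_c[c] if a == 1 else 1 - p_a_given_c[c]
    pb = p_b_given_c[c] if b == 1 else 1 - p_b_given_c[c]
    return p_c[c] * pa * pb

# Pr(A=1 | B=1, C=1) computed from the joint distribution ...
num = joint(1, 1, 1)
den = sum(joint(a, 1, 1) for a in (0, 1))
pr_a_given_bc = num / den

# ... matches Pr(A=1 | C=1), as the defining identity requires.
pr_a_given_c = p_a_given_c[1]
assert abs(pr_a_given_bc - pr_a_given_c) < 1e-12
```

Conditioning on [math]B[/math] leaves the distribution of [math]A[/math] unchanged only because the joint was constructed as [math]\Pr(A, B \mid C) = \Pr(A \mid C)\Pr(B \mid C)[/math]; for an arbitrary joint distribution the two sides would generally differ.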

## References

### 2011

- http://en.wikipedia.org/wiki/Conditional_independence
- In probability theory, two events [math]R[/math] and [math]B[/math] are **conditionally independent** given a third event [math]Y[/math] precisely if the occurrence or non-occurrence of [math]R[/math] *and* the occurrence or non-occurrence of [math]B[/math] are independent events in their conditional probability distribution given [math]Y[/math]. In other words, [math]R[/math] and [math]B[/math] are conditionally independent if and only if, given knowledge of whether [math]Y[/math] occurs, knowledge of whether [math]R[/math] occurs provides no information on the likelihood of [math]B[/math] occurring, and knowledge of whether [math]B[/math] occurs provides no information on the likelihood of [math]R[/math] occurring. In the standard notation of probability theory, [math]R[/math] and [math]B[/math] are conditionally independent given [math]Y[/math] if and only if [math]\Pr(R \cap B \mid Y) = \Pr(R \mid Y)\Pr(B \mid Y)[/math] or, equivalently, [math]\Pr(R \mid B \cap Y) = \Pr(R \mid Y)[/math]. Two random variables [math]X[/math] and [math]Y[/math] are **conditionally independent** given a third random variable [math]Z[/math] if and only if they are independent in their conditional probability distribution given [math]Z[/math]. That is, [math]X[/math] and [math]Y[/math] are conditionally independent given [math]Z[/math] if and only if, given any value of [math]Z[/math], the probability distribution of [math]X[/math] is the same for all values of [math]Y[/math] and the probability distribution of [math]Y[/math] is the same for all values of [math]X[/math].

### 2005

- (Rue & Held, 2005) ⇒ Håvard Rue, and Leonhard Held. (2005). “Gaussian Markov Random Fields: Theory and Applications.” CRC Press. ISBN:1584884320
- QUOTE: Conditional independence is a powerful concept. Let [math]\mathbf{x} = (x_1,x_2,x_3)^T[/math] be a random vector; then [math]x_1[/math] and [math]x_2[/math] are conditionally independent given [math]x_3[/math] if, for a known value of [math]x_3[/math], discovering [math]x_2[/math] tells you nothing new about the distribution of [math]x_1[/math].
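In the Gaussian Markov random field setting that Rue and Held study, this conditional independence has a well-known algebraic signature: a zero entry in the precision matrix (the inverse covariance) between [math]x_1[/math] and [math]x_2[/math] means [math]x_1[/math] and [math]x_2[/math] are conditionally independent given the rest. A minimal numeric sketch with NumPy (the precision-matrix values are illustrative assumptions, not from the book):

```python
import numpy as np

# Precision matrix Q with Q[0, 1] = 0: in a Gaussian, a zero between
# x1 and x2 in the precision matrix means x1 is conditionally
# independent of x2 given x3. Values below are illustrative.
Q = np.array([[2.0, 0.0, 0.5],
              [0.0, 3.0, 1.0],
              [0.5, 1.0, 4.0]])
Sigma = np.linalg.inv(Q)  # covariance matrix

# Conditional covariance of (x1, x2) given x3, via the Schur complement:
# Cov((x1, x2) | x3) = Sigma_AA - Sigma_A3 Sigma_33^{-1} Sigma_3A.
S = Sigma[:2, :2] - np.outer(Sigma[:2, 2], Sigma[2, :2]) / Sigma[2, 2]

# The off-diagonal of the conditional covariance is (numerically) zero,
# so x1 and x2 are uncorrelated -- hence, being Gaussian, independent --
# once x3 is known.
assert abs(S[0, 1]) < 1e-12
```

Note that [math]x_1[/math] and [math]x_2[/math] are *not* marginally independent here (`Sigma[0, 1]` is nonzero, since both depend on [math]x_3[/math]); only the conditional dependence vanishes, which is exactly the distinction the quote draws.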