# Conditional Independence Relation

A Conditional Independence Relation is a statistical relation that is true if and only if $\displaystyle{ \Pr(A \mid B \cap C) = \Pr(A \mid C) }$, where $\displaystyle{ \Pr }$ is a conditional probability function.
• In probability theory, two events $\displaystyle{ R }$ and $\displaystyle{ B }$ are conditionally independent given a third event $\displaystyle{ Y }$ precisely if the occurrence or non-occurrence of $\displaystyle{ R }$ and the occurrence or non-occurrence of $\displaystyle{ B }$ are independent events in their conditional probability distribution given $\displaystyle{ Y }$. In other words, $\displaystyle{ R }$ and $\displaystyle{ B }$ are conditionally independent given $\displaystyle{ Y }$ if and only if, given knowledge of whether $\displaystyle{ Y }$ occurs, knowledge of whether $\displaystyle{ R }$ occurs provides no information on the likelihood of $\displaystyle{ B }$ occurring, and knowledge of whether $\displaystyle{ B }$ occurs provides no information on the likelihood of $\displaystyle{ R }$ occurring. In the standard notation of probability theory, $\displaystyle{ R }$ and $\displaystyle{ B }$ are conditionally independent given $\displaystyle{ Y }$ if and only if $\displaystyle{ \Pr(R \cap B \mid Y) = \Pr(R \mid Y)\Pr(B \mid Y), }$ or equivalently, $\displaystyle{ \Pr(R \mid B \cap Y) = \Pr(R \mid Y). }$
• Two random variables $\displaystyle{ X }$ and $\displaystyle{ Y }$ are conditionally independent given a third random variable $\displaystyle{ Z }$ if and only if they are independent in their conditional probability distribution given $\displaystyle{ Z }$. That is, $\displaystyle{ X }$ and $\displaystyle{ Y }$ are conditionally independent given $\displaystyle{ Z }$ if and only if, given any value of $\displaystyle{ Z }$, the probability distribution of $\displaystyle{ X }$ is the same for all values of $\displaystyle{ Y }$, and the probability distribution of $\displaystyle{ Y }$ is the same for all values of $\displaystyle{ X }$.
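The definitions above can be checked numerically. The sketch below uses a hypothetical toy distribution (the particular probability values are illustrative assumptions, not from the source): two binary variables $\displaystyle{ X }$ and $\displaystyle{ Y }$ each depend only on a common binary cause $\displaystyle{ Z }$, so they satisfy $\displaystyle{ \Pr(X, Y \mid Z) = \Pr(X \mid Z)\Pr(Y \mid Z) }$ by construction, while remaining marginally dependent:

```python
from itertools import product

# Hypothetical toy model (illustrative numbers): Z selects one of two
# regimes, and X and Y are each drawn independently given Z.
p_z = {0: 0.5, 1: 0.5}
p_x_given_z = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}
p_y_given_z = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.1, 1: 0.9}}

# Joint distribution over (x, y, z); conditional independence of X and Y
# given Z holds by construction of this factorization.
joint = {(x, y, z): p_z[z] * p_x_given_z[z][x] * p_y_given_z[z][y]
         for x, y, z in product([0, 1], repeat=3)}

def pr(event):
    """Probability of the set of (x, y, z) outcomes satisfying `event`."""
    return sum(p for xyz, p in joint.items() if event(*xyz))

# Check Pr(X=1, Y=1 | Z=1) == Pr(X=1 | Z=1) * Pr(Y=1 | Z=1).
pz1 = pr(lambda x, y, z: z == 1)
lhs = pr(lambda x, y, z: x == 1 and y == 1 and z == 1) / pz1
rhs = (pr(lambda x, y, z: x == 1 and z == 1) / pz1) * \
      (pr(lambda x, y, z: y == 1 and z == 1) / pz1)
assert abs(lhs - rhs) < 1e-12

# Yet X and Y are NOT marginally independent: the shared cause Z
# induces a dependence once Z is not conditioned on.
pxy = pr(lambda x, y, z: x == 1 and y == 1)
px = pr(lambda x, y, z: x == 1)
py = pr(lambda x, y, z: y == 1)
assert abs(pxy - px * py) > 1e-3
```

This also illustrates the final sentence above: conditioning on any value of $\displaystyle{ Z }$ makes the conditional distribution of $\displaystyle{ X }$ the same regardless of $\displaystyle{ Y }$, even though the unconditional distributions are linked through $\displaystyle{ Z }$.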