# Conditional Independence Relation

A Conditional Independence Relation is a statistical relation that holds if and only if $\Pr(A \mid B \cap C) = \Pr(A \mid C)$, where $\Pr$ is a conditional probability function.
• In probability theory, two events $R$ and $B$ are conditionally independent given a third event $Y$ precisely if the occurrence or non-occurrence of $R$ and the occurrence or non-occurrence of $B$ are independent events in their conditional probability distribution given $Y$. In other words, $R$ and $B$ are conditionally independent if and only if, given knowledge of whether $Y$ occurs, knowledge of whether $R$ occurs provides no information on the likelihood of $B$ occurring, and knowledge of whether $B$ occurs provides no information on the likelihood of $R$ occurring. In the standard notation of probability theory, $R$ and $B$ are conditionally independent given $Y$ if and only if $\Pr(R \cap B \mid Y) = \Pr(R \mid Y)\Pr(B \mid Y),\,$ or equivalently, $\Pr(R \mid B \cap Y) = \Pr(R \mid Y).\,$
• Two random variables $X$ and $Y$ are conditionally independent given a third random variable $Z$ if and only if they are independent in their conditional probability distribution given $Z$. That is, $X$ and $Y$ are conditionally independent given $Z$ if and only if, given any value of $Z$, the probability distribution of $X$ is the same for all values of $Y$ and the probability distribution of $Y$ is the same for all values of $X$.
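The defining identity $\Pr(R \cap B \mid Y) = \Pr(R \mid Y)\Pr(B \mid Y)$ can be checked numerically. The sketch below builds a hypothetical joint distribution over three binary events $R$, $B$, $Y$ that is conditionally independent by construction (the specific probability values `p_y`, `p_r_given_y`, `p_b_given_y` are illustrative assumptions, not from the source), then verifies the identity for each value of $Y$:

```python
from itertools import product

# Hypothetical distribution, constructed so that R and B are conditionally
# independent given Y: for each value y of Y we pick P(R=1 | Y=y) and
# P(B=1 | Y=y) independently and multiply. All numbers are illustrative.
p_y = {0: 0.4, 1: 0.6}          # marginal P(Y=y)
p_r_given_y = {0: 0.2, 1: 0.7}  # P(R=1 | Y=y)
p_b_given_y = {0: 0.5, 1: 0.9}  # P(B=1 | Y=y)

def joint(r, b, y):
    """Joint probability P(R=r, B=b, Y=y) under the construction above."""
    pr = p_r_given_y[y] if r else 1 - p_r_given_y[y]
    pb = p_b_given_y[y] if b else 1 - p_b_given_y[y]
    return p_y[y] * pr * pb

for y in (0, 1):
    # P(Y=y), obtained by marginalizing the joint over R and B
    pY = sum(joint(r, b, y) for r, b in product((0, 1), repeat=2))
    pRB_given_Y = joint(1, 1, y) / pY                        # P(R ∩ B | Y=y)
    pR_given_Y = sum(joint(1, b, y) for b in (0, 1)) / pY    # P(R | Y=y)
    pB_given_Y = sum(joint(r, 1, y) for r in (0, 1)) / pY    # P(B | Y=y)
    # The conditional independence identity holds for every value of Y
    assert abs(pRB_given_Y - pR_given_Y * pB_given_Y) < 1e-12
```

Note that $R$ and $B$ built this way are typically *not* unconditionally independent: marginalizing out $Y$ mixes the two conditional distributions, which in general introduces dependence between $R$ and $B$.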