# Independence

## A tricky point about mutual independence

If a set of random variables $X_1, \dots, X_n$ are mutually independent, then they are also pairwise independent. However, the converse does NOT hold! A collection can be pairwise independent without being mutually independent, and it is easy to construct a counterexample.

## Counterexample

Consider two fair coin flips. Let $X$ be "heads on the first flip", $Y$ be "heads on the second flip", and $Z$ be "both flips show the same face". Clearly $X$ and $Y$ are independent. $Z$ is also independent of $X$: regardless of the value of $X$, there is an equal chance that the second flip matches or differs from the first, which is exactly what $Z$ asks about. The same logic holds for $Y$. However, $X, Y, Z$ are not mutually independent: if we know $X$ and $Y$ are both true, then $Z$ is necessarily true, so $P(X, Y, Z) = 1/4$ while $P(X)P(Y)P(Z) = 1/8$. It's like the XOR problem we talk about in the Bayesian network section.
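As a sanity check, the counterexample can be verified by brute-force enumeration of the four equally likely outcomes (a small sketch, not part of the original notes; the event definitions mirror the prose above):

```python
from itertools import product
from fractions import Fraction

# The four equally likely outcomes of two fair coin flips.
outcomes = list(product(["H", "T"], repeat=2))

def prob(event):
    """Probability that `event` holds, over the uniform sample space."""
    return Fraction(sum(event(o) for o in outcomes), len(outcomes))

X = lambda o: o[0] == "H"   # heads on the first flip
Y = lambda o: o[1] == "H"   # heads on the second flip
Z = lambda o: o[0] == o[1]  # both flips show the same face

both = lambda a, b: (lambda o: a(o) and b(o))
all3 = lambda o: X(o) and Y(o) and Z(o)

# Pairwise independence: P(A and B) == P(A) * P(B) for every pair.
assert prob(both(X, Y)) == prob(X) * prob(Y)
assert prob(both(X, Z)) == prob(X) * prob(Z)
assert prob(both(Y, Z)) == prob(Y) * prob(Z)

# Mutual independence fails: P(X, Y, Z) = 1/4, but P(X)P(Y)P(Z) = 1/8.
assert prob(all3) == Fraction(1, 4)
assert prob(X) * prob(Y) * prob(Z) == Fraction(1, 8)
```

Using `Fraction` keeps the probabilities exact, so the equality checks are not subject to floating-point error.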

## Conditional independence

We say two events $\alpha, \beta$ are `conditionally independent` given a third event $\gamma$ if

$$P(\alpha, \beta \mid \gamma) = P(\alpha \mid \gamma)\, P(\beta \mid \gamma),$$

even though $\alpha$ and $\beta$ may well be dependent on their own. A good example is $\alpha =$ "AC turns on", $\beta =$ "thermometer goes up", and $\gamma =$ "it gets hotter". As mentioned in our discussion on graphical models, $\alpha$ and $\beta$ are strongly dependent on each other! However, they are united under a common cause (things getting hotter), so once we know $\gamma$, observing $\alpha$ provides no further information about $\beta$, and vice versa.
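The common-cause story can be checked numerically with a toy joint distribution (a sketch with made-up numbers, not from the original notes): "hot" occurs with probability 1/2, and given the temperature each effect occurs independently, with probability 9/10 when it is hot and 1/10 otherwise.

```python
from fractions import Fraction

F = Fraction
# Joint distribution over (hot, ac_on, thermo_up): the two effects are
# generated independently *given* the common cause "hot".
joint = {}
for hot in (True, False):
    p_effect = F(9, 10) if hot else F(1, 10)
    for ac in (True, False):
        for th in (True, False):
            p = F(1, 2)                       # P(hot) = P(not hot) = 1/2
            p *= p_effect if ac else 1 - p_effect
            p *= p_effect if th else 1 - p_effect
            joint[(hot, ac, th)] = p

def marg(pred):
    """Probability of the event described by `pred(hot, ac, th)`."""
    return sum(p for k, p in joint.items() if pred(*k))

# Marginally, AC and thermometer are dependent:
p_ac = marg(lambda h, a, t: a)
p_th = marg(lambda h, a, t: t)
p_both = marg(lambda h, a, t: a and t)
assert p_both != p_ac * p_th              # 41/100 vs. 1/4

# Conditioned on gamma = "it is hot", they become independent:
p_hot = marg(lambda h, a, t: h)
p_ac_g = marg(lambda h, a, t: h and a) / p_hot
p_th_g = marg(lambda h, a, t: h and t) / p_hot
p_both_g = marg(lambda h, a, t: h and a and t) / p_hot
assert p_both_g == p_ac_g * p_th_g        # 81/100 on both sides
```

The marginal dependence arises purely because both effects track the hidden temperature; once the temperature is fixed, the coupling disappears.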