Statistical Independence
Two events $A$ and $B$ are independent if and only if
$P(A \cap B) = P(A) \cdot P(B)$

An event with probability 0 is defined to be independent of every event (including itself).

MULTIPLY: $P(A \cap B) = P(A \mid B) \cdot P(B)$ → Conditional Probability
ADD: $P(A \cup B) = P(A) + P(B) - P(A \cap B)$ → Inclusion-Exclusion Principle
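A minimal numerical check of both rules, using an assumed example (two fair six-sided dice, not from the note itself): $A$ = "first die is even", $B$ = "second die shows 6". These are independent, so the product rule holds, and inclusion-exclusion holds for any pair of events.

```python
from fractions import Fraction

# Assumed example: sample space of two fair six-sided dice.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    """Exact probability of an event over the uniform sample space."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] % 2 == 0   # first die even
B = lambda w: w[1] == 6       # second die shows 6

p_a = prob(A)                          # 1/2
p_b = prob(B)                          # 1/6
p_ab = prob(lambda w: A(w) and B(w))   # P(A ∩ B)
p_aub = prob(lambda w: A(w) or B(w))   # P(A ∪ B)

assert p_ab == p_a * p_b               # MULTIPLY: A and B are independent
assert p_aub == p_a + p_b - p_ab       # ADD: inclusion-exclusion
```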

title: [[Independence (Statistics)]] != [[Mutual Exclusivity]]
Do not confuse statistical independence with mutual exclusivity. A coin-toss example makes the distinction clear.
- When you toss a single coin, two events can occur: heads or tails. These events are mutually exclusive of one another, but NOT independent.
- On the other hand, if you flip two separate coins, the occurrence of heads or tails on one coin is [[Independence (Statistics)|Independent]] of the other, but the events are NOT mutually exclusive.
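The two bullets above can be verified directly with exact arithmetic (a sketch, assuming fair coins):

```python
from fractions import Fraction

# One coin: heads and tails are mutually exclusive but NOT independent.
p_h = Fraction(1, 2)
p_t = Fraction(1, 2)
p_h_and_t = Fraction(0)            # a single toss cannot be both
assert p_h_and_t != p_h * p_t      # 0 != 1/4, so dependent

# Two coins: outcomes on different coins are independent, NOT exclusive.
two_coins = [(a, b) for a in "HT" for b in "HT"]
p_first_h = Fraction(sum(1 for w in two_coins if w[0] == "H"), 4)
p_second_h = Fraction(sum(1 for w in two_coins if w[1] == "H"), 4)
p_both_h = Fraction(sum(1 for w in two_coins if w == ("H", "H")), 4)
assert p_both_h == p_first_h * p_second_h   # independent
assert p_both_h != 0                        # not mutually exclusive
```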

title: Definition
Events $A$, $B$, and $C$ are (mutually) independent if:
1. $P(A \cap B \cap C) = P(A) \cdot P(B) \cdot P(C)$
2. Each pair of events is independent: $P(A \cap B) = P(A) \cdot P(B)$, $P(A \cap C) = P(A) \cdot P(C)$, and $P(B \cap C) = P(B) \cdot P(C)$.

Go with the mathematical check, not your intuition: the events are independent only if both conditions hold, and neither condition implies the other.
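A classic counterexample (assumed here, not from the note) shows why condition 2 alone is not enough: with two fair coins, let $A$ = first coin heads, $B$ = second coin heads, $C$ = exactly one head. Every pair is independent, yet condition 1 fails.

```python
from fractions import Fraction

# Assumed example: two fair coins.
omega = [(a, b) for a in "HT" for b in "HT"]

def prob(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == "H"                      # first coin heads
B = lambda w: w[1] == "H"                      # second coin heads
C = lambda w: (w[0] == "H") != (w[1] == "H")   # exactly one head

# Condition 2 holds: every pair is independent...
assert prob(lambda w: A(w) and B(w)) == prob(A) * prob(B)
assert prob(lambda w: A(w) and C(w)) == prob(A) * prob(C)
assert prob(lambda w: B(w) and C(w)) == prob(B) * prob(C)

# ...but condition 1 fails: A ∩ B ∩ C is impossible, so the triple
# product rule does not hold and A, B, C are NOT mutually independent.
assert prob(lambda w: A(w) and B(w) and C(w)) != prob(A) * prob(B) * prob(C)
```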

title: [[Independence (Statistics)]] of [[Random Variable]]s
Random variables $X$ and $Y$ are independent if:
$$P(X \leq x, Y \leq y) = P(X \leq x) \cdot P(Y \leq y), \forall x, y \in \mathbb{R}$$

For the discrete case, we have
$P(X = x, Y = y) = P(X = x) \cdot P(Y = y), \forall x, y$
For three or more random variables, the same generalization applies:
$P(X \leq x, Y \leq y, Z \leq z) = P(X \leq x) \cdot P(Y \leq y) \cdot P(Z \leq z), \forall x, y, z \in \mathbb{R}$
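For discrete random variables, the definition can be checked exhaustively: factor the joint pmf into the product of the marginals at every point of the support. A sketch, assuming $X$ and $Y$ are the faces of two fair dice:

```python
from fractions import Fraction
from itertools import product

# Assumed example: X, Y = faces of two fair dice, uniform joint pmf.
support = range(1, 7)
joint = {(x, y): Fraction(1, 36) for x, y in product(support, support)}

def marginal_x(x):
    """P(X = x), obtained by summing the joint pmf over y."""
    return sum(joint[(x, y)] for y in support)

def marginal_y(y):
    """P(Y = y), obtained by summing the joint pmf over x."""
    return sum(joint[(x, y)] for x in support)

# Independent iff P(X = x, Y = y) = P(X = x) * P(Y = y) for ALL x, y.
assert all(joint[(x, y)] == marginal_x(x) * marginal_y(y)
           for x, y in product(support, support))
```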

title: Conditional Independence
From CS287, this might seem confusing, but it is the same idea as everything above; recall that $P(x, y)$ is the same as $P(x \cap y)$.

$P(x, y \mid z) = P(x \mid z) \cdot P(y \mid z)$
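A hypothetical example (not from the note) that illustrates the definition: $Z$ picks one of two biased coins, then $X$ and $Y$ are two flips of that same coin. Given $Z$, the flips factor as above, yet $X$ and $Y$ are not independent unconditionally, because both carry information about which coin was chosen.

```python
from fractions import Fraction
from itertools import product

# Assumed setup: Z chooses a coin with bias 1/4 or 3/4, each with prob 1/2;
# X and Y are two flips of the chosen coin (1 = heads).
bias = {0: Fraction(1, 4), 1: Fraction(3, 4)}
p_z = {0: Fraction(1, 2), 1: Fraction(1, 2)}

def p_flip(x, z):
    """P(X = x | Z = z): heads with the chosen coin's bias."""
    return bias[z] if x == 1 else 1 - bias[z]

# Joint P(X = x, Y = y, Z = z); the flips are independent given z.
joint = {(x, y, z): p_z[z] * p_flip(x, z) * p_flip(y, z)
         for x, y, z in product((0, 1), repeat=3)}

def p_xy_given_z(x, y, z):
    return joint[(x, y, z)] / p_z[z]

# Conditional independence: P(x, y | z) = P(x | z) * P(y | z) for all x, y, z.
assert all(p_xy_given_z(x, y, z) == p_flip(x, z) * p_flip(y, z)
           for x, y, z in product((0, 1), repeat=3))

# But X and Y are NOT independent unconditionally:
p_x1 = sum(joint[(1, y, z)] for y, z in product((0, 1), repeat=2))
p_y1 = sum(joint[(x, 1, z)] for x, z in product((0, 1), repeat=2))
p_x1y1 = sum(joint[(1, 1, z)] for z in (0, 1))
assert p_x1y1 != p_x1 * p_y1
```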