Statistical Independence

Two events $A$ and $B$ are independent if and only if
$$P(A \cap B) = P(A) \cdot P(B)$$

An event with probability 0 is defined to be independent of every event (including itself).
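The product-rule definition can be checked by direct enumeration. A minimal sketch, assuming a fair six-sided die with the (illustrative) events "roll is even" and "roll is at most 4":

```python
from fractions import Fraction

# Sample space: a fair six-sided die; every outcome has probability 1/6.
omega = {1, 2, 3, 4, 5, 6}
prob = lambda event: Fraction(len(event), len(omega))

A = {2, 4, 6}      # "roll is even"      -> P(A) = 1/2
B = {1, 2, 3, 4}   # "roll is at most 4" -> P(B) = 2/3

# Independence check: P(A ∩ B) == P(A) * P(B), i.e. 1/3 == 1/2 * 2/3
assert prob(A & B) == prob(A) * prob(B)
```

Using `Fraction` keeps the probabilities exact, so the equality test is not affected by floating-point rounding.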

Mnemonic: MULTIPLY probabilities for intersections ([[Conditional Probability]]); ADD probabilities for unions ([[Inclusion-Exclusion Principle]]).

title: [[Independence (Statistics)]] != [[Mutual Exclusivity]]
Do not confuse statistical independence with mutual exclusivity. To understand this, think about a coin toss example.
 
- When you toss a single coin, two outcomes are possible: heads or tails. These events are mutually exclusive of one another, but NOT independent.
- On the other hand, if you flip two separate coins, the occurrence of heads or tails on the two coins is [[Independence (Statistics)|Independent]] from one coin to the other, but the events are NOT mutually exclusive.
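Both bullet points can be verified by enumerating the sample spaces directly. A small sketch of the coin-toss example:

```python
from fractions import Fraction
from itertools import product

# Two fair coins: sample space of four equally likely outcomes.
omega = set(product("HT", repeat=2))
prob = lambda event: Fraction(len(event), len(omega))

H1 = {w for w in omega if w[0] == "H"}   # first coin shows heads
H2 = {w for w in omega if w[1] == "H"}   # second coin shows heads

# Independent: the product rule holds (1/4 == 1/2 * 1/2)...
assert prob(H1 & H2) == prob(H1) * prob(H2)
# ...but NOT mutually exclusive: both coins can show heads at once.
assert H1 & H2 != set()

# One fair coin: heads and tails on the SAME flip.
single = {"H", "T"}
p = lambda event: Fraction(len(event), len(single))
# Mutually exclusive (the events cannot co-occur)...
assert {"H"} & {"T"} == set()
# ...but NOT independent: 0 != 1/2 * 1/2.
assert p({"H"} & {"T"}) != p({"H"}) * p({"T"})
```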
title: Definition
Events $A$, $B$, and $C$ are (mutually) independent if:
1. $P(A \cap B \cap C) = P(A) \cdot P(B) \cdot P(C)$
2. Each pair of events is independent: $P(A \cap B) = P(A) \cdot P(B)$, $P(A \cap C) = P(A) \cdot P(C)$, and $P(B \cap C) = P(B) \cdot P(C)$.

Go with the mathematical check, not your intuition. The events are independent only if both conditions hold; pairwise independence alone is not enough.

Independence for Random Variables

title: [[Independence (Statistics)]] of [[Random Variable]]s
Random variables $X$ and $Y$ are independent if: 
$$P(X \leq x, Y \leq y) = P(X \leq x) \cdot P(Y \leq y), \forall x, y \in \mathbb{R}$$

For the discrete case, this is equivalent to the joint pmf factoring into the marginals:
$$P(X = x, Y = y) = P(X = x) \cdot P(Y = y), \forall x, y$$
For three or more random variables, we have the same generalization: the joint distribution must factor into the product of all the marginals.
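The discrete factorization can be checked by enumeration. A sketch assuming two independent fair dice, where the marginals are computed from the joint pmf and the product rule is verified at every point:

```python
from fractions import Fraction
from itertools import product

# Joint pmf of two independent fair dice: each of the 36 pairs has mass 1/36.
outcomes = list(product(range(1, 7), repeat=2))
joint = {(x, y): Fraction(1, 36) for x, y in outcomes}

# Marginal pmfs recovered by summing the joint over the other variable.
px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in range(1, 7)}
py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in range(1, 7)}

# Discrete independence: P(X = x, Y = y) == P(X = x) * P(Y = y) everywhere.
assert all(joint[(x, y)] == px[x] * py[y] for x, y in outcomes)
```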

Conditional Independence

From CS287, this might seem confusing, but it is the same idea as everything we have done above, just with every probability conditioned on a third event. Events $A$ and $B$ are conditionally independent given $C$ if
$$P(A \cap B \mid C) = P(A \mid C) \cdot P(B \mid C)$$
Reminder that the comma notation $P(A, B \mid C)$ is the same as $P(A \cap B \mid C)$.
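Conditional independence does not imply unconditional independence. A hypothetical two-coin sketch: pick one of two coins uniformly at random ($C$ = which coin was picked) and flip it twice. The flips are independent *given* the coin by construction, but mixing over the coin choice makes them dependent:

```python
from fractions import Fraction

# Hypothetical setup: coin 1 lands heads with prob 9/10, coin 2 with 1/10;
# the coin is chosen uniformly, then flipped twice.
p_c1 = Fraction(1, 2)
p_h = {1: Fraction(9, 10), 2: Fraction(1, 10)}   # P(heads | coin)

# Conditional independence given the coin:
# P(H1 ∩ H2 | coin) = P(H1 | coin) * P(H2 | coin) by construction.
p_hh_given = {c: p_h[c] * p_h[c] for c in (1, 2)}

# Marginalize out the coin (law of total probability).
p_h1 = p_c1 * p_h[1] + (1 - p_c1) * p_h[2]                    # = 1/2
p_h1h2 = p_c1 * p_hh_given[1] + (1 - p_c1) * p_hh_given[2]    # = 41/100

# Unconditionally the flips are NOT independent: 41/100 != 1/4.
assert p_h1h2 != p_h1 * p_h1
```

Intuitively, seeing heads on the first flip is evidence that the biased-toward-heads coin was picked, which raises the probability of heads on the second flip.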