# Asymptotic Analysis

Notes from CS240, where we did a formal study of big-O notation.

### Smaller / Larger Asymptotic Bounds

$O$-notation: $f(n)∈O(g(n))$ ($f$ is **asymptotically bounded above** by $g)$ if there exist constants $c>0$ and $n_{0}≥0$ such that $∣f(n)∣≤c∣g(n)∣$ for all $n≥n_{0}$.

$Ω$-notation: $f(n)∈Ω(g(n))$ ($f$ is **asymptotically bounded below** by $g$) if there exist constants $c>0$ and $n_{0}≥0$ such that $c∣g(n)∣≤∣f(n)∣$ for all $n≥n_{0}$.

$Θ$-notation: $f(n)∈Θ(g(n))$ ($f$ is **asymptotically tightly bounded** by $g$) if there exist constants $c_{1},c_{2}>0$ and $n_{0}≥0$ such that $c_{1}∣g(n)∣≤∣f(n)∣≤c_{2}∣g(n)∣$ for all $n≥n_{0}$.

- This is usually what we talk about in Leetcode interviews: $f(n)∈Θ(g(n))⟺f(n)∈O(g(n))$ and $f(n)∈Ω(g(n))$
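A worked example of the $Θ$ definition with explicit constants (my own illustration, not from lecture): claim $2n^{2}+3n+1∈Θ(n^{2})$.

```latex
% Lower bound: 2n^2 + 3n + 1 >= 2n^2 for all n >= 1, so c_1 = 2 works.
% Upper bound: for n >= 1 we have 3n <= 3n^2 and 1 <= n^2, hence
%   2n^2 + 3n + 1 <= 2n^2 + 3n^2 + n^2 = 6n^2, so c_2 = 6 works.
\[
  2n^{2} \;\le\; 2n^{2}+3n+1 \;\le\; 6n^{2}
  \quad\text{for all } n \ge 1
  \qquad (c_{1}=2,\; c_{2}=6,\; n_{0}=1).
\]
```

Exhibiting any one valid triple $(c_{1}, c_{2}, n_{0})$ is enough; the constants are not unique.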

Do NOT get confused

For some reason, I used to think that $Ω$ was synonymous with “best case”, $O$ was the same as “worst case”, and $Θ$ was the “average case”. This is misleading: these notations bound *functions*, not cases. For each case (best, worst, average), you can calculate $Ω$, $O$, and $Θ$ — e.g. insertion sort’s best case is $Θ(n)$ and its worst case is $Θ(n^{2})$, each a tight bound on its own.

Maybe the confusion comes from the Sorting Algorithms page (yea…)

### Strictly Smaller / Larger Asymptotic Bounds

$o$-notation: $f(n)∈o(g(n))$ ($f$ is **asymptotically strictly smaller** than $g$) if for all constants $c>0$, there exists a constant $n_{0}≥0$ such that $∣f(n)∣≤c∣g(n)∣$ for all $n≥n_{0}$.

$ω$-notation: $f(n)∈ω(g(n))$ ($f$ is **asymptotically strictly larger** than $g$) if for all constants $c>0$, there exists a constant $n_{0}≥0$ such that $∣f(n)∣≥c∣g(n)∣$ for all $n≥n_{0}$.

Notes

- The main difference from $O$ and $Ω$ is the quantifier on $c$: the inequality must hold for **every** $c>0$, not just for some $c$.
- Rarely proved from first principles; in practice a limit argument is usually easier.
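To see the quantifier difference in action, here is a quick first-principles proof (my own sketch) that $n∈o(n^{2})$: for *every* $c>0$ we must produce an $n_{0}$.

```latex
% Claim: n \in o(n^2).
% Given any c > 0, choose n_0 = \lceil 1/c \rceil.
% Then for all n >= n_0:
%   n >= 1/c  =>  c n >= 1  =>  c n^2 >= n,
% i.e. |f(n)| = n <= c n^2 = c |g(n)|.
\[
  \forall c > 0:\quad
  n \ge \left\lceil \tfrac{1}{c} \right\rceil
  \;\implies\; n \le c\,n^{2},
  \qquad\text{hence } n \in o(n^{2}).
\]
```

Note how $n_{0}$ is allowed to depend on $c$ — smaller $c$ forces a larger threshold.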

### Algebra of Order Notations

Identity rule: $f(n)∈Θ(f(n))$

Transitivity:

- If $f(n)∈O(g(n))$ and $g(n)∈O(h(n))$ then $f(n)∈O(h(n))$
- If $f(n)∈Ω(g(n))$ and $g(n)∈Ω(h(n))$ then $f(n)∈Ω(h(n))$
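A one-line proof sketch of the $O$ case (the $Ω$ case is symmetric): multiply the two constants and take the larger of the two thresholds.

```latex
% If |f(n)| <= c_1 |g(n)| for n >= n_1, and |g(n)| <= c_2 |h(n)| for n >= n_2,
% then for all n >= max(n_1, n_2):
\[
  |f(n)| \;\le\; c_{1}|g(n)| \;\le\; c_{1}c_{2}\,|h(n)|,
\]
% so f(n) \in O(h(n)) with c = c_1 c_2 and n_0 = max(n_1, n_2).
```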

Maximum rules: Suppose that $f(n)>0$ and $g(n)>0$ for all $n≥n_{0}$. Then:

- $f(n)+g(n)∈O(\max\{f(n),g(n)\})$
- $f(n)+g(n)∈Ω(\max\{f(n),g(n)\})$
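Both directions follow from one sandwich, so combined they give $f(n)+g(n)∈Θ(\max\{f(n),g(n)\})$ — this is the rule that justifies dropping lower-order terms:

```latex
% For f, g > 0:
%   max{f,g} <= f + g        (the sum adds a nonnegative term to the max)
%   f + g <= 2 max{f,g}      (each summand is at most the max)
\[
  \max\{f(n),g(n)\} \;\le\; f(n)+g(n) \;\le\; 2\max\{f(n),g(n)\}.
\]
% Example: n^2 + n \in \Theta(\max\{n^2, n\}) = \Theta(n^2).
```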