# Big-O Notation

Big-O notation is useful when writing code with performance in mind: it describes how an algorithm's running time and memory scale with the input size.

### Competitive Programming

| $n$ | Big-$O$ Upper Bound |
|---|---|
| $n≤10$ | $O(n!)$ |
| $n≤20$ | $O(2^n)$ |
| $n≤50$ | $O(n^5)$ |
| $n≤100$ | $O(n^4)$ |
| $n≤500$ | $O(n^3)$ |
| $n≤3000$ | $O(n^2 \log n)$ |
| $n≤5000$ | $O(n^2)$ |
| $n≤10^5$ | $O(n \log n) \sim O(n\sqrt{n})$ |
| $n≤10^6$ | $O(n \log n) \sim O(n)$ |
| $n≤10^{12}$ | $O(\sqrt{n})$ |
| $n≤10^{18}$ | $O(\log n)$ |
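As a sanity check on the table, a minimal sketch (the $10^8$ operations budget and the function names are my own assumptions; $10^8$ simple operations per second is a common rule of thumb, not a guarantee, and constant factors matter):

```python
import math

def estimated_ops(n):
    """Rough operation counts for common complexities at input size n."""
    return {
        "O(log n)": math.log2(n),
        "O(sqrt n)": math.sqrt(n),
        "O(n)": n,
        "O(n log n)": n * math.log2(n),
        "O(n^2)": n ** 2,
        # Guard against huge exponents; past ~60 it is hopeless anyway.
        "O(2^n)": 2 ** n if n <= 60 else float("inf"),
    }

def feasible(n, budget=10**8):
    """Names of the complexities whose operation count fits in the budget."""
    return [name for name, ops in estimated_ops(n).items() if ops <= budget]
```

For example, `feasible(5000)` still includes `"O(n^2)"` ($25 \cdot 10^6$ operations), but `feasible(10**5)` does not ($10^{10}$ operations), matching the table's rows.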

### Math

Definition from MATH119

Given two functions $f$ and $g$, we say that “$f$ is of order $g$ as $x→x_0$” and write $f(x)=O(g(x))$ as $x→x_0$ if there exists a constant $A>0$ such that $|f(x)| ≤ A\,|g(x)|$ on some interval around $x_0$ (although the point $x_0$ itself may be excluded from the interval, since the idea is to describe the behaviour of $f$ in the limit as we approach $x_0$).

In computer science, we deal with behaviour as $n→∞$ instead of $x→x_{0}$.
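The definition can be checked empirically (a sketch, not a proof; the example $f$ and the witness constant $A=4$ are my own): $f(n) = 3n + 5$ is $O(n)$ as $n→∞$, since $|f(n)| ≤ 4n$ for all $n ≥ 5$.

```python
def f(n):
    return 3 * n + 5

# A = 4 is one valid witness constant, good for every n >= 5.
# (The constant is not unique: any A > 3 works from some point on.)
assert all(abs(f(n)) <= 4 * n for n in range(5, 10**4))
```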

- Ahh, but actually, there are 3 different types of bounds: upper ($O$), lower ($Ω$), and tight ($Θ$). See Asymptotic Analysis, which I learned in CS240.

### Time Complexity

Common Growth Rates (from CS240):

- $O(1)$ $→$ Constant-time algorithm
- $O(\log n)$ $→$ Logarithmic algorithm
- $O(\sqrt{n})$ $→$ Square-root algorithm
- $O(n)$ $→$ Linear algorithm
- $O(n \log n)$ $→$ Linearithmic algorithm
- $O(n \log^k n)$ for some constant $k$ $→$ Quasi-linear algorithm
- $O(n^2)$ $→$ Quadratic algorithm, usually two nested loops
- $O(n^3)$ $→$ Cubic algorithm, usually three nested loops
- $O(2^n)$ $→$ Exponential algorithm, iterates through all subsets of the input elements
- $O(n!)$ $→$ Factorial algorithm, iterates through all permutations of the input elements
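The last two rows can be made concrete. A small sketch (function names are my own) that enumerates all $2^n$ subsets via bitmasks and all $n!$ permutations:

```python
from itertools import permutations

def all_subsets(items):
    """O(2^n): each bitmask 0 .. 2^n - 1 selects one subset."""
    n = len(items)
    return [[items[i] for i in range(n) if mask >> i & 1]
            for mask in range(1 << n)]

def all_orderings(items):
    """O(n!): every permutation of the input."""
    return list(permutations(items))

# len(all_subsets([1, 2, 3])) == 2**3 == 8
# len(all_orderings([1, 2, 3])) == 3*2*1 == 6
```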

### Space Complexity

Similar idea to time complexity, but relating to how much memory an algorithm needs as the input grows.
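For instance (a sketch with made-up function names), both of these run in $O(n)$ time, but the first uses $O(n)$ extra space while the second uses $O(1)$:

```python
def sum_with_list(n):
    values = list(range(1, n + 1))  # materializes n integers: O(n) space
    return sum(values)

def sum_running(n):
    total = 0                       # a single accumulator: O(1) space
    for i in range(1, n + 1):
        total += i
    return total

# Both compute the same value, e.g. sum_with_list(100) == sum_running(100) == 5050
```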

### Algebra of Big O

These rules are for the $x→0$ setting (as in the MATH119 definition above), where smaller powers dominate:

- $k \cdot O(x^n) = O(x^n)$, for any constant $k$
- $O(x^m) + O(x^n) = O(x^q)$, where $q = \min(m, n)$
- $O(x^m) \cdot O(x^n) = O(x^{m+n})$
- $[O(x^n)]^m = O(x^{mn})$
- $\dfrac{O(x^m)}{x^n} = O(x^{m-n})$

Note that we **cannot** state a general rule for simplifying $\dfrac{O(x^m)}{O(x^n)}$.
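A worked example applying the multiplication and addition rules, as $x→0$ (my own example, not from the course):

```latex
\begin{aligned}
\bigl(x + O(x^2)\bigr)\bigl(1 + O(x)\bigr)
  &= x + x\,O(x) + O(x^2) + O(x^2)\,O(x) \\
  &= x + O(x^2) + O(x^2) + O(x^3) \\
  &= x + O(x^2)
  \qquad \text{using } O(x^m) + O(x^n) = O(x^{\min(m,n)}).
\end{aligned}
```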

### Personal Experimentation

While working at Ericsson, I did a cycle-cost analysis (measuring how many CPU cycles each operation costs) because I wanted to understand how much slower each function is in practice.