Greedy Algorithms, Dynamic Programming

# Interval Scheduling

I really like the notes by Lapchi on this. Very intuitive.

https://cs.uwaterloo.ca/~lapchi/cs341/notes/L11.pdf

Don't focus on the algorithm itself; rather, understand why it is correct. Then you can come up with the algorithm quickly.

### Interval Scheduling

Interval Scheduling Problem

Input: $n$ intervals $[s_1, f_1], [s_2, f_2], \dots, [s_n, f_n]$.

Output: a maximum-size subset of pairwise disjoint intervals.

There are a few greedy ideas that don't work:

- Choose earliest starting time
  - Counter-example: $[1,10], [2,5], [6,8]$. The earliest start is $[1,10]$, which blocks the other two; the optimum is $\{[2,5],[6,8]\}$.
- Choose shortest intervals first
  - Counter-example: $[1,5], [4,7], [6,10]$. The shortest interval $[4,7]$ overlaps both of the others; the optimum is $\{[1,5],[6,10]\}$.
- Choose intervals with the minimum number of overlaps first
  - Counter-example: take four disjoint intervals $a_1, a_2, a_3, a_4$, three copies of an interval overlapping $a_1$ and $a_2$, three copies overlapping $a_3$ and $a_4$, and one interval $m$ overlapping only $a_2$ and $a_3$ (e.g. the $a$'s are $[0,3],[5,8],[10,13],[15,18]$, the copies are of $[2,6]$ and $[12,16]$, and $m=[7,11]$). Then $m$ has the fewest overlaps (two), but picking it first blocks $a_2$ and $a_3$, leaving at most three intervals, while the optimum $\{a_1,a_2,a_3,a_4\}$ has four.
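The greedy strategy that does work is to repeatedly pick the interval with the earliest finish time. A minimal Python sketch (function name is mine):

```python
def interval_scheduling(intervals):
    """Greedy: repeatedly take the interval with the earliest finish
    time that is disjoint from everything already chosen."""
    chosen, last_finish = [], float("-inf")
    # Sorting by finish time means each accepted candidate is the
    # earliest-finishing interval compatible with the current selection.
    for s, f in sorted(intervals, key=lambda iv: iv[1]):
        if s > last_finish:  # closed intervals: disjoint iff start is after last finish
            chosen.append((s, f))
            last_finish = f
    return chosen
```

On the first counter-example above, `interval_scheduling([(1, 10), (2, 5), (6, 8)])` returns `[(2, 5), (6, 8)]`, the optimal pair.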

The correct greedy algorithm repeatedly selects the interval with the earliest finish time among those disjoint from the intervals already selected. The proof of optimality compares an optimal solution $O$ to the output set $S$ produced by the greedy algorithm, and assumes for contradiction that $|S| < |O|$.

Write $O = j_1, j_2, \dots$ and $S = i_1, i_2, \dots$, with the intervals of each solution listed in order of finish time.

Claim: $f(i_r) \le f(j_r)$ for every $r$.

Base case ($r = 1$): $f(i_1) \le f(j_1)$, since the greedy algorithm takes the interval with the earliest finish time overall.

Inductive step: assume $f(i_{r-1}) \le f(j_{r-1})$. Since $O$ is feasible, $f(j_{r-1}) < s(j_r)$. Therefore $f(i_{r-1}) < s(j_r)$.

So $j_r$ is disjoint from the first $r-1$ greedy choices, and is still available when the greedy algorithm makes its $r$-th selection. The interval $i_r$ it actually selects is either $j_r$ itself or one that finishes even earlier, so $f(i_r) \le f(j_r)$.

Now let $k = |S|$, the number of intervals the greedy algorithm selects. By the claim, $f(i_k) \le f(j_k)$. Since $O$ is bigger, the interval $j_{k+1}$ exists, and feasibility of $O$ gives $f(j_k) < s(j_{k+1})$.

Combining the two inequalities, $f(i_k) < s(j_{k+1})$: the interval $j_{k+1}$ is disjoint from everything in $S$, so the greedy algorithm would have considered it rather than stopping after $i_k$. This is a contradiction, so $|S| \ge |O|$ and the greedy output is optimal.
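As a sanity check on the claim $|S| = |O|$, one can compare the size of the greedy solution against a brute-force optimum on small instances, including the counter-examples above (helper names are mine, not from the notes):

```python
from itertools import combinations

def greedy_count(intervals):
    """Size of the earliest-finish-time greedy solution |S|."""
    count, last_finish = 0, float("-inf")
    for s, f in sorted(intervals, key=lambda iv: iv[1]):
        if s > last_finish:  # disjoint from everything chosen so far
            count, last_finish = count + 1, f
    return count

def optimum(intervals):
    """|O| by brute force: size of the largest pairwise-disjoint subset."""
    def disjoint(subset):
        subset = sorted(subset)
        return all(a[1] < b[0] for a, b in zip(subset, subset[1:]))
    # The empty subset (r = 0) is always disjoint, so max() is well defined.
    return max(r for r in range(len(intervals) + 1)
               if any(disjoint(c) for c in combinations(intervals, r)))
```

On the fewest-overlaps counter-example, both functions agree on 4, confirming the greedy is optimal there even though fewest-overlaps-first is not.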