Gustafson’s Law
Gustafson’s Law is the optimistic counterpart to Amdahl’s Law. Instead of holding the problem size fixed, it holds runtime constant and asks how large a problem you can solve in that time as processors are added.
Amdahl’s ceiling only applies when problem size is fixed. Gustafson (1988) argued that the point of a faster computer is to attack bigger problems.
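The contrast can be made concrete with the two standard formulas: Amdahl’s fixed-size speedup S_A(N) = 1 / (s + (1 − s)/N) versus Gustafson’s scaled speedup S_G(N) = s + (1 − s)·N, where s is the serial fraction and N the processor count. A minimal sketch (the 5% serial fraction is purely illustrative):

```python
def amdahl(s: float, n: int) -> float:
    """Speedup on n processors with serial fraction s, problem size fixed."""
    return 1.0 / (s + (1.0 - s) / n)

def gustafson(s: float, n: int) -> float:
    """Scaled speedup: the problem grows so the parallel part fills n processors."""
    return s + (1.0 - s) * n

s = 0.05  # assume 5% serial work, chosen only for illustration
for n in (8, 64, 1024):
    print(f"N={n:5d}  Amdahl={amdahl(s, n):7.2f}  Gustafson={gustafson(s, n):8.2f}")
```

With s = 0.05, Amdahl’s speedup can never exceed 1/s = 20 no matter how many processors you add, while Gustafson’s scaled speedup keeps growing nearly linearly in N.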
Key observation
Scaling up the problem usually grows the parallel part. The serial part stays roughly constant. So as you scale, the serial fraction shrinks, and more processors can tackle a bigger problem.
This works whenever there’s a problem-size knob:
- grid resolution
- number of timesteps
- dataset size
- queries per second
Google handling web-scale traffic is the canonical example.
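The shrinking serial fraction can be sketched numerically. Assuming a fixed serial setup cost while the parallel work grows with the problem-size knob (all numbers illustrative):

```python
SERIAL_COST = 10.0  # fixed setup/coordination cost, in arbitrary time units

def serial_fraction(parallel_work: float) -> float:
    """Fraction of single-processor runtime that is serial."""
    return SERIAL_COST / (SERIAL_COST + parallel_work)

# Turning the problem-size knob (grid resolution, dataset size, ...)
for work in (100, 10_000, 1_000_000):
    print(f"parallel work={work:>9}  serial fraction={serial_fraction(work):.4%}")
```

As the parallel work grows from 100 to 1,000,000 units, the serial fraction falls from about 9% to about 0.001%, which is exactly why the Amdahl ceiling recedes for scaled problems.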
When Gustafson doesn't save you
The law assumes the serial part stays roughly constant as the problem grows. If the serial part scales with the problem too (e.g. a sequential global reduction over the new data), the serial fraction no longer shrinks and you’re back to an Amdahl-style ceiling. It also assumes the bigger problem is one you actually want to solve: a weather model run at finer resolution is useful; the same forecast computed faster than anyone needs is not.
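The failure mode can be sketched by comparing the two cases. In the first, serial cost is fixed; in the second, the serial cost grows in proportion to the problem (e.g. a sequential reduction over all n items). The 5% proportion is an arbitrary assumption for illustration:

```python
def serial_fraction_constant(n: int) -> float:
    """Serial cost fixed at 10 units; parallel work grows with n."""
    return 10.0 / (10.0 + n)

def serial_fraction_scaling(n: int) -> float:
    """Serial cost is 5% of n (e.g. a sequential reduction over all n items)."""
    return 0.05 * n / (0.05 * n + n)

for n in (1_000, 1_000_000):
    print(f"n={n:>9}  constant-serial={serial_fraction_constant(n):.4%}  "
          f"scaling-serial={serial_fraction_scaling(n):.4%}")
```

The constant-serial fraction vanishes as n grows, but the scaling-serial fraction stays pinned at 0.05/1.05 ≈ 4.8% regardless of n, so speedup is again capped near 1/0.048 ≈ 21 no matter how large the problem gets.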