Trading Accuracy for Time

Trading accuracy for time means speeding up a program by either skipping work or intentionally reducing accuracy, on the principle that "close enough is good enough". It's the opposite of speculation: speculation does extra work just in case; this does less work. Covered in ECE459 L14.

When is this acceptable?

When a small error is invisible or tolerable (graphics, physics simulations) and the cost of exactness outweighs the benefit. Build a statistical model of the computation so you know how much error you're introducing.

Exam analogy

Skip question 3.2 to work on 4.1 (skipping work), or skip the error handling in 4.1 to free up time (reducing accuracy). A program might support one or the other, but not both at once.

Covered techniques

  • Early Phase Termination: quit before all results come in
  • Loop Perforation: skip iterations
  • Reduced-resource computation: use less precision (float instead of double, ints for money, Google’s TPUs use reduced-precision 8- and 16-bit arithmetic)

Reduced-Resource Computation Examples

Circuit analysis. Inputs are measurements with error and resistors have ±5% tolerance. Computing to 5 decimal places is pointless.

Fast inverse square root (Quake III, popularly attributed to John Carmack). Graphics code needs it for fast vector normalization.

float FastInvSqrt(float x) {
  float xhalf = 0.5f * x;
  int i = *(int*)&x;         // evil floating point bit level hacking
  i = 0x5f3759df - (i >> 1); // what the fuck?
  x = *(float*)&i;
  x = x*(1.5f-(xhalf*x*x));
  return x;
}

Reinterpret the float's bits as an int, apply a magic offset, reinterpret back, then run one Newton-Raphson iteration. Constant-time, with roughly 0.175% maximum relative error. Now mostly obsolete thanks to dedicated CPU instructions (e.g. rsqrtss on x86).

N-body simulation with bins. Force drops off with the square of distance. Divide space into cubic bins, compute the centre of mass of each bin, then use the centre of mass for faraway bins and exact pairwise forces for the current bin plus its 26 neighbours.

Lecture measurement: with 100k points, the approximation alone took the runtime from 162 s to 147 s without threads. With a rayon parallel iterator, both the naive and binned versions land around 25 s: parallelism dominates the approximation savings at this problem size. On big enough problems, everything counts.