# Gauss-Newton Method

This method approximates Newton's (second-order) method: instead of computing the full Hessian, Gauss-Newton uses $JJ^{T}$ as an approximation of the Hessian matrix, so no second derivatives are needed.
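To see why this is only an approximation, expand the true Hessian of $\frac{1}{2}\|f(x)\|^{2}$ (using this document's convention that $J^{T}$ appears in the linearization of $f$):

$$\nabla^{2}\left(\tfrac{1}{2}\|f(x)\|^{2}\right)=J(x)J(x)^{T}+\sum_{i}f_{i}(x)\,\nabla^{2}f_{i}(x)$$

Gauss-Newton drops the second term, which is small when the residuals $f_{i}$ are small or $f$ is close to linear.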

As a reminder from Non-Linear Least Squares, our goal is to minimize $\frac{1}{2}\|f(x)\|^{2}$ in the least squares formulation, which we do iteratively by finding a $Δx$ such that $\frac{1}{2}\|f(x+Δx)\|^{2}$ is minimized.

$f(x+Δx)≈f(x)+J(x)^{T}Δx$

We substitute this approximation into the objective, take the derivative with respect to $Δx$, and set it to zero (this turns the nonlinear problem into a linear one): $J(x)f(x)+J(x)J^{T}(x)Δx=0$
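Spelling out the intermediate step, the expansion of the squared norm is

$$\frac{1}{2}\|f(x)+J(x)^{T}Δx\|^{2}=\frac{1}{2}\left(f^{T}f+2f^{T}J^{T}Δx+Δx^{T}JJ^{T}Δx\right)$$

whose gradient with respect to $Δx$ is exactly $Jf+JJ^{T}Δx$.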

Rearranging the terms, we get the normal equation $J(x)J^{T}(x)Δx=−J(x)f(x)$. So we have

- $H=JJ^{T}$
- $g=−Jf$

Wait… but $f$ isn't linear, so how is $g$ computed? This is fine: $J(x)$ and $f(x)$ are both evaluated at the current estimate $x$, so $g$ is just a matrix-vector product. - With $f:\mathbb{R}^{n}\to\mathbb{R}^{n}$ here, $J$ is $n×n$ and $f$ is $n×1$, so $g$ is $n×1$
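A tiny shape check makes this concrete. The values below are arbitrary placeholders; the point is only that $g=−Jf$ and $H=JJ^{T}$ are plain matrix products of quantities evaluated at the current $x$:

```python
import numpy as np

n = 3
J = np.arange(9.0).reshape(n, n)  # J evaluated at the current x, n x n
f = np.ones((n, 1))               # residual f(x), n x 1
g = -J @ f                        # g = -J f, an ordinary product: n x 1
H = J @ J.T                       # H = J J^T: n x n
print(g.shape, H.shape)
```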

## The Gauss-Newton Algorithm

1. Give an initial value $x_{0}$.
2. For the $k$-th iteration, compute the Jacobian $J(x_{k})$ and the residual $f(x_{k})$.
3. Solve the normal equation $HΔx_{k}=g$ (e.g. by Gaussian elimination, or by Cholesky decomposition since $H$ is symmetric positive semi-definite).
4. If $Δx_{k}$ is small enough, stop. Otherwise, set $x_{k+1}=x_{k}+Δx_{k}$ and return to step 2.
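The steps above can be sketched in NumPy. This is a minimal illustration, not a robust implementation: the model $y=\exp(ax+b)$, the noise-free synthetic data, and the starting guess are all assumptions chosen for the example, and $J$ is stored as $n×m$ so that $H=JJ^{T}$ matches the convention in the text:

```python
import numpy as np

# Hypothetical example: fit y = exp(a*x + b) to synthetic (noise-free) data.
a_true, b_true = 1.0, 2.0
xs = np.linspace(0.0, 1.0, 100)
ys = np.exp(a_true * xs + b_true)

def residual(p):
    a, b = p
    return np.exp(a * xs + b) - ys    # f(x_k), shape (m,)

def jacobian(p):
    a, b = p
    e = np.exp(a * xs + b)
    # Stored as n x m so that H = J @ J.T, matching the text's convention.
    return np.vstack([xs * e, e])     # rows: df/da, df/db

p = np.array([0.9, 1.9])              # step 1: initial value x_0
for k in range(50):
    J = jacobian(p)                   # step 2: J(x_k) and f(x_k)
    f = residual(p)
    H = J @ J.T                       # H = J J^T
    g = -J @ f                        # g = -J f
    dp = np.linalg.solve(H, g)        # step 3: solve H dx = g
    p = p + dp                        # step 4: update and check the step size
    if np.linalg.norm(dp) < 1e-10:
        break

print(p)  # close to (a_true, b_true)
```

In practice the plain Gauss-Newton step can diverge when the initial guess is far from the solution or $H$ is ill-conditioned, which is why damped variants such as Levenberg-Marquardt are usually preferred.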