# Least Squares

This is basically the line of best fit. I first saw this in the Kalman Filter book; learning it more seriously from Cyrill Stachniss in the context of SLAM.

Resources

Least Squares vs. SVD?

SVD (Singular Value Decomposition): A technique for factorizing a matrix into three matrices, $A = U \Sigma V^T$. Used in many applications, including dimensionality reduction, noise reduction, and solving linear systems (including least squares problems).

Least Squares: A method for estimating the coefficients in a linear regression model. Minimizes the sum of the squared differences between observed and predicted values.
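The two are connected: SVD is one way to *solve* a least squares problem (it's what `numpy.linalg.lstsq` uses under the hood). A small sketch with made-up line-fitting data, comparing the normal-equations solution against the SVD-based pseudoinverse:

```python
import numpy as np

# Hypothetical data: y ≈ m*x + c with a little noise (m=2, c=1 are assumptions)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 20)
y = 2.0 * x + 1.0 + rng.normal(0, 0.1, size=x.shape)

# Design matrix so that A @ [m, c] ≈ y
A = np.column_stack([x, np.ones_like(x)])

# Least squares via the normal equations: (A^T A) params = A^T y
params_normal = np.linalg.solve(A.T @ A, A.T @ y)

# Same problem via SVD: pseudoinverse is V diag(1/s) U^T
U, s, Vt = np.linalg.svd(A, full_matrices=False)
params_svd = Vt.T @ ((U.T @ y) / s)

print(params_normal)                           # ≈ [2.0, 1.0]
print(np.allclose(params_normal, params_svd))
```

Both routes give the same answer here; SVD is preferred in practice when $A^T A$ is ill-conditioned, since it avoids squaring the condition number.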

Kajanan taught me linear least squares. That is, if you can represent the error term $e(x)$ as $e(x) = Ax - b$,

then you know how to minimize the squared error $E(x) = e(x)^T e(x)$ by setting its derivative to zero: $\frac{dE}{dx} = 0$.
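Carrying that derivative through (a sketch of the standard derivation, filling in steps not written out above):

$$
E(x) = (Ax - b)^T (Ax - b), \qquad
\frac{dE}{dx} = 2 A^T (Ax - b) = 0
\;\Rightarrow\; A^T A \, x = A^T b
$$

These are the normal equations; when $A^T A$ is invertible, the minimizer is $x = (A^T A)^{-1} A^T b$.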