# Euclidean Distance

Calculated using the Pythagorean Theorem, where the Euclidean Distance is given by $H(x_{n},y_{n})=\sqrt{(x_{n}-x_{g})^{2}+(y_{n}-y_{g})^{2}}$

The squared Euclidean Distance simply drops the square root, i.e. $H(x_{n},y_{n})=(x_{n}-x_{g})^{2}+(y_{n}-y_{g})^{2}$
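A minimal sketch of both formulas, assuming $(x_n, y_n)$ is the current node and $(x_g, y_g)$ the goal (the function names are my own):

```python
import math

def euclidean_distance(xn, yn, xg, yg):
    """Straight-line distance from node (xn, yn) to goal (xg, yg)."""
    return math.sqrt((xn - xg) ** 2 + (yn - yg) ** 2)

def squared_euclidean_distance(xn, yn, xg, yg):
    """Same quantity without the square root; cheaper to compute."""
    return (xn - xg) ** 2 + (yn - yg) ** 2

print(euclidean_distance(0, 0, 3, 4))          # 5.0
print(squared_euclidean_distance(0, 0, 3, 4))  # 25
```

Since the square root is monotonic, the squared version preserves the ordering of distances, which is why it is often used when only comparisons matter.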

### Machine Learning

In Computer Vision, this is called the L2 Distance / L2 Norm: $d_{2}(I_{1},I_{2})=\sqrt{\sum_{p}\left(I_{1}^{p}-I_{2}^{p}\right)^{2}}$ Serendipity: this is closely related to the Root Mean Square (which additionally divides by the number of terms before taking the root), which I saw in ECE140.

Squared L2 Norm: $d_{2}(I_{1},I_{2})=\sum_{p}\left(I_{1}^{p}-I_{2}^{p}\right)^{2}$ See Loss Function for a discussion of why we would use L1 Distance vs. L2 Distance.
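A quick NumPy sketch of both norms on two toy pixel vectors (the values are made up for illustration):

```python
import numpy as np

# Two toy "images" flattened to pixel vectors (hypothetical values).
I1 = np.array([10.0, 20.0, 30.0])
I2 = np.array([13.0, 24.0, 30.0])

l2 = np.sqrt(np.sum((I1 - I2) ** 2))   # L2 distance
squared_l2 = np.sum((I1 - I2) ** 2)    # squared L2: same sum, no root

print(l2)          # 5.0
print(squared_l2)  # 25.0
```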

### Sum of Squared Differences (SSD)

This is the term I encountered in the Computer Vision world, learning from Cyrill Stachniss.

$SSD=\sum_{i,j}\left(I(i,j)-T(i,j)\right)^{2}$
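A sketch of SSD used for template matching: slide a template $T$ over an image $I$ and keep the offset with the smallest SSD. The function names and the tiny arrays are my own illustration, not from any particular library:

```python
import numpy as np

def ssd(patch, template):
    """Sum of Squared Differences between an image patch and a template."""
    return np.sum((patch - template) ** 2)

def best_match(image, template):
    """Exhaustively find the top-left offset with minimal SSD."""
    h, w = template.shape
    H, W = image.shape
    best_pos, best_score = None, float("inf")
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            score = ssd(image[i:i + h, j:j + w], template)
            if score < best_score:
                best_pos, best_score = (i, j), score
    return best_pos, best_score

image = np.array([[0, 0, 0, 0],
                  [0, 1, 2, 0],
                  [0, 3, 4, 0],
                  [0, 0, 0, 0]], dtype=float)
template = np.array([[1, 2],
                     [3, 4]], dtype=float)

print(best_match(image, template))  # ((1, 1), 0.0)
```

SSD is zero exactly where the patch equals the template, so minimizing it locates the template in the image.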