# Harris Corner

Learning from Cyrill Stachniss.

Resources

- Learned from the Visual Feature lecture: https://www.youtube.com/watch?v=nGya59Je4Bs&list=PLgnQpQtFTOGRYjqjdZxTEQPZuFHQa7O7Y&index=14&ab_channel=CyrillStachniss
- slides here

- Wikipedia also has solid formulas: https://en.wikipedia.org/wiki/Harris_corner_detector
- Slides from CMU https://www.cs.cmu.edu/~16385/s17/Slides/6.2_Harris_Corner_Detector.pdf

Criterion:

$$\begin{align} R &= \text{det}(M) - k(\text{trace}(M))^2 \\ &= \lambda_1 \lambda_2 - k(\lambda_1 + \lambda_2)^2 \end{align}$$

- $M$ is the [[notes/Structure Matrix|Structure Matrix]]
- $\lambda_1$ and $\lambda_2$ are the two eigenvalues of the structure matrix
- $k \in [0.04, 0.06]$

So in practice, we write it as a system of equations and solve for $\lambda_1$ and $\lambda_2$:

$$\text{det}(M) - k (\text{trace}(M))^2 = \lambda_1 \lambda_2 - k(\lambda_1 + \lambda_2)^2$$

#### Solving the equation

Cyrill Stachniss seems to approach this in a different way? I reasoned this out myself, with help from [[notes/ChatGPT|ChatGPT]].

The eigenvalues of a matrix $M$ are found by solving the characteristic equation $\text{det}(M - \lambda I) = 0$, where $I$ is the identity matrix and $\lambda$ represents the eigenvalues.

For the 2x2 [[notes/Structure Matrix|Structure Matrix]] $M$ of the Harris Corner Detector,

$$M = \begin{bmatrix} \sum_{\text{W}} J_x^2 & \sum_{\text{W}} J_x J_y \\ \sum_{\text{W}} J_y J_x & \sum_{\text{W}} J_y^2 \end{bmatrix}$$

Rewriting the matrix in simpler terms,

$$M = \begin{pmatrix} A & C \\ C & B \end{pmatrix}$$

the [[notes/Eigenvalues and Eigenvectors|Characteristic Equation]] is:

$$\text{det} \left(\begin{pmatrix} A - \lambda & C \\ C & B - \lambda \end{pmatrix} \right) = (A - \lambda)(B - \lambda) - C^2 = 0$$

Simplifying, we get:

$$\lambda^2 - (A+B)\lambda + (AB - C^2) = 0$$

Solving this equation gives us the [[notes/Eigenvalues and Eigenvectors|Eigenvalue]]s. Notice that

$$\text{det}(M) = AB - C^2, \quad \text{trace}(M) = A+B$$

so we can simplify the equation to

$$\lambda^2 - \text{trace}(M)\, \lambda + \text{det}(M) = 0$$

- Plug this into the [[notes/Quadratic Formula|Quadratic Formula]] and you will find the 2 eigenvalues.

> [!question] Are there always 2 eigenvalues?
>
> The structure matrix is real and symmetric, so both eigenvalues are always real; a symmetric matrix is never defective. The two eigenvalues may coincide, though, in which case there is only one distinct eigenvalue (a repeated root of the characteristic equation).
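The derivation above can be checked numerically: the eigenvalues from the quadratic formula against `np.linalg.eigvals`, and the two forms of the Harris criterion against each other. A minimal sketch; the window sums `A`, `B`, `C` are made-up values, not from the lecture:

```python
import numpy as np

# Structure-matrix entries for one window: A = sum Jx^2, B = sum Jy^2, C = sum Jx*Jy
# (made-up example values for a corner-like window)
A, B, C = 10.0, 8.0, 2.0
M = np.array([[A, C],
              [C, B]])

# Eigenvalues from the characteristic equation
#   lambda^2 - trace(M)*lambda + det(M) = 0
# solved with the quadratic formula
trace, det = A + B, A * B - C**2
disc = np.sqrt(trace**2 - 4 * det)  # discriminant is >= 0 for a symmetric matrix
lam1, lam2 = (trace + disc) / 2, (trace - disc) / 2

# They match a numerical eigensolver
assert np.allclose(sorted([lam1, lam2]), sorted(np.linalg.eigvals(M)))

# Harris criterion written both ways -- they agree exactly
k = 0.05  # k in [0.04, 0.06]
R_eigen = lam1 * lam2 - k * (lam1 + lam2)**2
R_dettr = det - k * trace**2
assert np.isclose(R_eigen, R_dettr)
print(lam1, lam2, R_dettr)
```

This is also why the criterion is convenient: $\text{det}(M)$ and $\text{trace}(M)$ come straight from the matrix entries, so you never need to solve for the eigenvalues at all.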
![[attachments/Screenshot 2023-10-07 at 3.31.30 PM.png]]

So how is this threshold calculated? In practice, I have 2 different eigenvalues; how do I know if $\lambda_1 \gg \lambda_2$?

> [!question] How fast / slow does this algorithm run?
>
> ?

### Related

- [[notes/Shi-Tomasi Corner Detector|Shi-Tomasi Corner Detector]]
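On the threshold question above: one common practical choice (an assumption on my part, not something from the lecture) is a *relative* threshold on the response map, e.g. keep pixels with $R$ above 1% of the maximum response. A sketch with made-up random gradients standing in for a real image:

```python
import numpy as np

rng = np.random.default_rng(0)
Jx = rng.standard_normal((32, 32))  # stand-in image gradients (made up)
Jy = rng.standard_normal((32, 32))

def box_sum(img, r=1):
    """Sum over a (2r+1)x(2r+1) window at every pixel (wraps at borders)."""
    out = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out

# Window sums of the structure-matrix entries
A = box_sum(Jx * Jx)
B = box_sum(Jy * Jy)
C = box_sum(Jx * Jy)

# Harris response per pixel: det(M) - k * trace(M)^2
k = 0.05
R = (A * B - C**2) - k * (A + B)**2

# Relative threshold: keep responses above 1% of the strongest one
corners = R > 0.01 * R.max()
print(corners.sum(), "candidate corner pixels")
```

Real detectors additionally apply non-maximum suppression so that each corner yields a single pixel, not a blob of neighbors above the threshold.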