Eigendecomposition
Eigendecomposition is a method for decomposing a matrix into a set of Eigenvalues and Eigenvectors.
Remember that if $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, then $Av = \lambda v$.
Suppose the matrix $A$ has $n$ linearly independent eigenvectors $\{v^{(1)}, \dots, v^{(n)}\}$ with corresponding eigenvalues $\{\lambda_1, \dots, \lambda_n\}$.
We can construct the matrix $V$ with one eigenvector per column, $V = [v^{(1)} \; \cdots \; v^{(n)}]$, and the diagonal matrix $\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_n)$.
The eigendecomposition is then given by
$$A = V \Lambda V^{-1}$$
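As a quick sanity check, here is a minimal NumPy sketch of this factorization (the matrix `A` below is an arbitrary example chosen to have distinct eigenvalues, nothing canonical):

```python
import numpy as np

# Arbitrary example matrix with distinct eigenvalues
# (hence linearly independent eigenvectors).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix V
# whose columns are the corresponding eigenvectors.
eigvals, V = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Reconstruct A from its eigendecomposition: A = V Lam V^{-1}.
assert np.allclose(A, V @ Lam @ np.linalg.inv(V))
```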
Derivation
When does this eigendecomposition exist? And why does it work? Let's first form the matrix $AV$. Notice that by the definition of eigenvalues and eigenvectors, each column satisfies $Av^{(i)} = \lambda_i v^{(i)}$, so you will have
$$AV = V\Lambda$$
Since the eigenvectors are linearly independent, $V$ is invertible, so multiplying by $V^{-1}$ on the right, we get:
$$A = V \Lambda V^{-1}$$
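The key step $AV = V\Lambda$ can be verified numerically, one eigenvector column at a time (same arbitrary example matrix as in the sketch above):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, V = np.linalg.eig(A)

# Column i of V is an eigenvector: A v_i = lambda_i v_i ...
for i in range(len(eigvals)):
    assert np.allclose(A @ V[:, i], eigvals[i] * V[:, i])

# ... so stacking all the columns gives A V = V Lam.
assert np.allclose(A @ V, V @ np.diag(eigvals))
```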
If $A$ is a real symmetric matrix, then we can decompose it into
$$A = Q \Lambda Q^\top$$
where
- $Q$ is an orthogonal matrix composed of eigenvectors of $A$
- $\Lambda$ is a diagonal matrix composed of the eigenvalues
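A minimal sketch of the symmetric case, using NumPy's `eigh` (which is specialized for symmetric/Hermitian input; the matrix is again an arbitrary example):

```python
import numpy as np

# Arbitrary real symmetric matrix.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns real eigenvalues (in ascending order)
# and orthonormal eigenvectors in the columns of Q.
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# A = Q Lam Q^T, and Q is orthogonal: Q^T Q = I.
assert np.allclose(A, Q @ Lam @ Q.T)
assert np.allclose(Q.T @ Q, np.eye(2))
```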
Two questions:
- Why does this always work? For real symmetric matrices, the spectral theorem guarantees a full set of orthonormal eigenvectors. So there always exists an eigendecomposition of this form.
- Why can we just take the transpose instead of the inverse? Because $Q$ is an Orthogonal Matrix, so $Q^\top Q = I$ and hence $Q^{-1} = Q^\top$ by definition!
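That identity is easy to confirm on the sketch above: for the orthogonal $Q$ returned by `eigh`, the explicit inverse and the transpose coincide up to floating-point error:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
_, Q = np.linalg.eigh(A)

# The transpose doubles as the inverse,
# so no explicit (and more expensive) inversion is needed.
assert np.allclose(np.linalg.inv(Q), Q.T)
```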
Note that the eigendecomposition is not guaranteed to be unique: if two eigenvalues are equal, any orthogonal vectors spanning their shared eigenspace are equally valid eigenvectors, and the eigenvalues can also be reordered along the diagonal.
The matrix $A$ is Singular if and only if any of its eigenvalues are zero.
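For instance (an arbitrary rank-deficient matrix; the determinant is the product of the eigenvalues, so it vanishes along with the zero eigenvalue):

```python
import numpy as np

# Rows are linearly dependent, so the matrix is singular.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
eigvals, _ = np.linalg.eig(A)

# One eigenvalue is (numerically) zero, and so is the
# determinant, since det(A) is the product of the eigenvalues.
assert np.any(np.isclose(eigvals, 0.0))
assert np.isclose(np.linalg.det(A), 0.0)
```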
Eigendecomposition only works on square matrices, because eigenvalues are only defined for square matrices.
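NumPy enforces this, for example, by raising a `LinAlgError` when handed a non-square array:

```python
import numpy as np

A = np.ones((2, 3))  # non-square, so eigenvalues are undefined

try:
    np.linalg.eig(A)
except np.linalg.LinAlgError as e:
    print(e)  # "Last 2 dimensions of the array must be square"
```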
The eigendecomposition of a real symmetric matrix can also be used to optimize quadratic expressions of the form $f(x) = x^\top A x$ subject to $\lVert x \rVert_2 = 1$: whenever $x$ is a unit eigenvector of $A$, $f(x)$ equals the corresponding eigenvalue, so the maximum of $f$ over the constraint set is the largest eigenvalue and the minimum is the smallest eigenvalue.
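A sketch of that optimization, reusing the arbitrary symmetric matrix from above (`eigh` sorts eigenvalues in ascending order, so the first and last columns of `Q` are the minimizer and maximizer):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(A)  # eigenvalues in ascending order

def f(x):
    return x @ A @ x

# The unit eigenvectors attain the extreme eigenvalues exactly.
assert np.isclose(f(Q[:, 0]), eigvals[0])    # minimum
assert np.isclose(f(Q[:, -1]), eigvals[-1])  # maximum

# Any other unit vector lands in between.
rng = np.random.default_rng(0)
for _ in range(100):
    x = rng.standard_normal(2)
    x /= np.linalg.norm(x)
    assert eigvals[0] - 1e-9 <= f(x) <= eigvals[-1] + 1e-9
```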
Related
- SVD (which generalizes the eigendecomposition)