Eigendecomposition, Eigenvectors and Eigenvalues
Eigendecomposition (the prefix *eigen* is pronounced "eye-gan") is a type of matrix decomposition that breaks a square matrix down into a set of eigenvectors and eigenvalues.
Note: Decomposition does NOT compress the matrix; instead, it breaks the matrix down into constituent parts that make certain operations on it easier to perform.
An eigenvector is a vector whose direction remains unchanged when a linear transformation is applied to it. The factor by which an eigenvector is stretched or squished by the transformation is known as its eigenvalue.
The eigenvalue λ tells whether the special vector x is stretched, shrunk, reversed (negative value), or left unchanged.
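This scaling behaviour can be checked numerically. The sketch below uses NumPy's `np.linalg.eig` on a small matrix chosen purely for illustration: applying the matrix to one of its eigenvectors only scales it by the corresponding eigenvalue.

```python
import numpy as np

# Illustrative 2x2 matrix (not from the text; chosen so the result is easy to see).
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding (unit-length) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

v = eigenvalues[0], eigenvectors[:, 0]  # first eigenvalue and its eigenvector
lam, v = v

# Applying A only scales v by lambda; the direction is unchanged.
print(np.allclose(A @ v, lam * v))  # True
```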
An eigenvector v of a square matrix A must satisfy the following equation: A v = λ v.
A matrix that has only positive eigenvalues is referred to as a positive definite matrix, whereas if the eigenvalues are all negative, it is referred to as a negative definite matrix.
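A minimal sketch of this classification, assuming symmetric example matrices chosen for illustration: compute the eigenvalues and inspect their signs.

```python
import numpy as np

def classify(M):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    eigenvalues = np.linalg.eigvalsh(M)  # eigvalsh: eigenvalues of a symmetric matrix
    if np.all(eigenvalues > 0):
        return "positive definite"
    if np.all(eigenvalues < 0):
        return "negative definite"
    return "indefinite or semi-definite"

# Illustrative matrices (assumed examples, not from the text).
print(classify(np.array([[2.0, 0.0], [0.0, 1.0]])))    # positive definite
print(classify(np.array([[-2.0, 0.0], [0.0, -1.0]])))  # negative definite
```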
The eigendecomposition itself is written as A = Q Λ Q⁻¹, where:

- A: parent square matrix
- Q: matrix comprised of the eigenvectors (as its columns)
- Λ: diagonal matrix comprised of the eigenvalues
- Q⁻¹: inverse of the matrix comprised of the eigenvectors
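The decomposition can be verified by rebuilding the parent matrix from its parts. This is a sketch assuming a small diagonalizable matrix chosen for illustration:

```python
import numpy as np

# Illustrative diagonalizable matrix (an assumed example).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, Q = np.linalg.eig(A)  # Q holds the eigenvectors as columns
Lam = np.diag(eigenvalues)         # Lambda: diagonal matrix of eigenvalues

# Rebuild the parent matrix from its constituent parts: A = Q Lam Q^-1
A_rebuilt = Q @ Lam @ np.linalg.inv(Q)
print(np.allclose(A, A_rebuilt))   # True
```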
In the eigenvector equation A v = λ v:

- A: parent square matrix
- v: eigenvector of the matrix
- λ: scalar eigenvalue
For the eigenvector v to be non-zero, the rearranged equation (A − λI) v = 0 must have a non-trivial solution. A matrix-vector multiplication can send a non-zero vector to zero only when the matrix squishes space into a lower dimension, i.e. det(A − λI) = 0.
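The vanishing determinant can be checked numerically. The sketch below uses an assumed example matrix: for every eigenvalue λ, det(A − λI) comes out (numerically) zero.

```python
import numpy as np

# Illustrative matrix (an assumed example, not from the text).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues = np.linalg.eigvals(A)

for lam in eigenvalues:
    # (A - lambda*I) is singular for every eigenvalue, so its determinant vanishes.
    print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))  # True
```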