Eigendecomposition (pronounced "eye-gen") of a matrix is a type of decomposition that factors a square matrix into a set of eigenvectors and eigenvalues.
Note: Decomposition does NOT compress the matrix; instead, it breaks the matrix down into constituent parts to make certain operations on it easier to perform.
The decomposition takes the form A = QΛQ⁻¹, where:
A : Parent square matrix
Q : Matrix whose columns are the eigenvectors
Λ : Diagonal matrix of the eigenvalues
Q⁻¹ : Inverse of the eigenvector matrix Q
Eigenvectors and Eigenvalues
An eigenvector is a vector whose direction remains unchanged when a linear transformation is applied to it. The factor by which an eigenvector is stretched or squished by the transformation is known as its eigenvalue.
The eigenvalue λ (lambda) tells us whether the special vector x is stretched, shrunk, reversed (negative value), or left unchanged.
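As a small illustration (the matrix below is a made-up example, not from the original text), a diagonal matrix simply scales the axis directions, so its eigenvalues read off the stretch factors directly:

```python
import numpy as np

# Hypothetical example: this matrix stretches the x-axis by a factor of 3
# and reverses the y-axis (eigenvalue -1)
A = np.array([[3.0, 0.0],
              [0.0, -1.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # one eigenvalue stretches (3), one reverses (-1)
```

Here one eigenvector is stretched by 3, and the other is flipped to point the opposite way.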
An eigenvector v of a matrix A must satisfy the equation Av = λv, where:
A : Parent square matrix
v : Eigenvector of the matrix
λ : Scalar eigenvalue
For the eigenvector v to be non-zero, the equation (A − λI)v = 0 can only hold when the matrix A − λI squishes space into a lower dimension, i.e. when det(A − λI) = 0.
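This condition can be checked numerically with NumPy (the 2×2 matrix below is an illustrative example, not from the original text):

```python
import numpy as np

# Hypothetical 2x2 example with eigenvalues 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# each row of eigenvectors.T is one eigenvector
for lam, v in zip(eigenvalues, eigenvectors.T):
    # A - lambda*I collapses space along v, so its determinant vanishes
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
    # ...and the pair satisfies A v = lambda v
    assert np.allclose(A @ v, lam * v)
print("det(A - lambda*I) = 0 for every eigenvalue")
```

If the asserts pass silently, every computed (λ, v) pair satisfies both the determinant condition and Av = λv.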
Positive and negative definite matrix
A matrix that has only positive eigenvalues is referred to as a positive definite matrix, whereas a matrix whose eigenvalues are all negative is referred to as a negative definite matrix.
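This can be checked directly from the signs of the eigenvalues. A minimal sketch (the helper function name and the matrix values are illustrative):

```python
import numpy as np

def definiteness(M):
    """Classify a matrix by the signs of its eigenvalues."""
    vals = np.linalg.eigvals(M)
    if np.all(vals > 0):
        return "positive definite"
    if np.all(vals < 0):
        return "negative definite"
    return "neither"

print(definiteness(np.array([[2.0, 1.0], [1.0, 2.0]])))    # eigenvalues 1 and 3
print(definiteness(np.array([[-2.0, 0.0], [0.0, -5.0]])))  # eigenvalues -2 and -5
```

The first matrix has eigenvalues 1 and 3 (both positive), so it is positive definite; the second has only negative eigenvalues, so it is negative definite.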
Calculate an eigendecomposition with NumPy
```python
from numpy import array
from numpy.linalg import eig

# define a square matrix
A = array([[1, 2, 3],
           [4, 5, 6],
           [7, 8, 9]])
# calculate the eigendecomposition
eigenvalues, eigenvectors = eig(A)
print(A)
print(eigenvalues)
print(eigenvectors)
```
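To confirm that the decomposition really recovers the original matrix, the parts can be multiplied back together as A = QΛQ⁻¹, where `numpy.diag` builds Λ from the eigenvalue vector:

```python
from numpy import array, diag, allclose
from numpy.linalg import eig, inv

A = array([[1, 2, 3],
           [4, 5, 6],
           [7, 8, 9]])
eigenvalues, Q = eig(A)  # columns of Q are the eigenvectors

# rebuild A from its parts: Q . Lambda . Q^-1
Lam = diag(eigenvalues)
A_reconstructed = Q @ Lam @ inv(Q)
print(allclose(A, A_reconstructed))  # True (up to floating-point tolerance)
```

Since this 3×3 matrix has three distinct eigenvalues, Q is invertible and the reconstruction matches A to within floating-point tolerance.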