Eigendecomposition, Eigenvectors and Eigenvalues


Eigendecomposition

Eigendecomposition (pronounced "eye-gan") of a matrix is a type of decomposition that breaks a square matrix down into a set of eigenvectors and eigenvalues.

Note: Decomposition does NOT result in a compression of the matrix; instead, it breaks it down into constituent parts to make certain operations on the matrix easier to perform.

Eigenvectors and Eigenvalues

An eigenvector is a vector whose direction remains unchanged when a linear transformation is applied to it. The factor by which an eigenvector is stretched or squished by the transformation is known as its eigenvalue.

The eigenvalue $\lambda$ tells whether the special vector $\vec{v}$ is stretched, shrunk, reversed (negative value), or left unchanged.

An eigenvector of a matrix must satisfy the following equation:

$$A \cdot \vec{v} = \lambda \cdot \vec{v}$$

$A$: Parent square matrix
$\vec{v}$: Eigenvector of the matrix
$\lambda$: Scalar eigenvalue

Matrix-vector multiplication ($A \cdot \vec{v}$) and scalar-vector multiplication ($\lambda \cdot \vec{v}$) are different operations, so convert the scalar $\lambda$ to a matrix by multiplying it with the identity matrix $I$:

$$A \cdot \vec{v} = (\lambda I) \cdot \vec{v}$$

$$A \cdot \vec{v} - (\lambda I) \cdot \vec{v} = \vec{0}$$

$$(A - \lambda I) \cdot \vec{v} = \vec{0}$$

For the eigenvector $\vec{v}$ to be non-zero, the only way the matrix $(A - \lambda I)$ can multiply $\vec{v}$ into the zero vector is by squishing space into a lower dimension, i.e. $\det(A - \lambda I) = 0$.
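A quick worked example (the 2x2 matrix here is an illustrative assumption, not from the original page): multiplying the vector $(1, 1)^T$ by the matrix below only scales it by 3, so it is an eigenvector with eigenvalue $\lambda = 3$.

$$\begin{bmatrix} 2 & 1 \\ 0 & 3 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \end{bmatrix} = 3 \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$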

Positive and negative definite matrix

A matrix that has only positive eigenvalues is referred to as a positive definite matrix, whereas if the eigenvalues are all negative, it is referred to as a negative definite matrix.
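A minimal sketch of this test in NumPy, assuming a symmetric input matrix (numpy.linalg.eigvals is a standard NumPy routine; the example matrix is chosen only for illustration):

from numpy import array
from numpy.linalg import eigvals

# Symmetric example matrix with eigenvalues 3 and 1, both positive
M = array([[2, -1], [-1, 2]])
vals = eigvals(M)

if (vals > 0).all():
    print('positive definite')
elif (vals < 0).all():
    print('negative definite')
else:
    print('neither (indefinite or semi-definite)')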

Calculate an eigendecomposition with NumPy

from numpy import array
from numpy.linalg import eig

A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
# Factorize A; the eigenvectors are returned as the columns of the second array
eigenvalues, eigenvectors = eig(A)

print(A)
print(eigenvalues)
print(eigenvectors)
[[1 2 3]
 [4 5 6]
 [7 8 9]]
 
[ 1.61168440e+01 -1.11684397e+00 -9.75918483e-16]

[[-0.23197069 -0.78583024  0.40824829]
 [-0.52532209 -0.08675134 -0.81649658]
 [-0.8186735   0.61232756  0.40824829]]
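Note that the third eigenvalue is numerically zero (on the order of 1e-16): this particular A is singular because its rows are linearly dependent, so one of its exact eigenvalues is 0.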

Confirm a vector is an eigenvector

from numpy import array
from numpy.linalg import eig
from numpy import dot

A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
eigenvalues, eigenvectors = eig(A)

# Verify eigenvector equation for the first eigenvalue and eigenvector
firstEigenValue = eigenvalues[0]
firstEigenVector = eigenvectors[:, 0]

# Eigenvector equation
# A . v = lambda . v
lhs = dot(A, firstEigenVector)
rhs = firstEigenValue * firstEigenVector

print(lhs)
print(rhs)
[ -3.73863537  -8.46653421 -13.19443305]
[ -3.73863537  -8.46653421 -13.19443305]
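Rather than comparing the printed vectors by eye, the check can be automated with numpy.allclose (a standard NumPy function). A small self-contained sketch that verifies every eigenpair:

from numpy import array, dot, allclose
from numpy.linalg import eig

A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
eigenvalues, eigenvectors = eig(A)

# allclose returns True when both sides agree within floating-point tolerance
for i in range(len(eigenvalues)):
    v = eigenvectors[:, i]
    print(allclose(dot(A, v), eigenvalues[i] * v))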

Reconstruct a matrix from eigenvectors and eigenvalues

$$A = Q \Lambda Q^{-1}$$

$A$: Parent square matrix
$Q$: Matrix comprised of the eigenvectors
$\Lambda$: Diagonal matrix comprised of the eigenvalues
$Q^{-1}$: Inverse of the matrix comprised of the eigenvectors

from numpy import array
from numpy.linalg import eig
from numpy import dot
from numpy.linalg import inv
from numpy import diag

A = array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
eigenvalues, eigenvectors = eig(A)

# A = Q . diag(eigenvalues) . Q^-1
Q = eigenvectors
diagEigenVals = diag(eigenvalues)
invQ = inv(Q)

compose = Q.dot(diagEigenVals).dot(invQ)
print(A)
print(compose)
[[1 2 3]
 [4 5 6]
 [7 8 9]]
[[1. 2. 3.]
 [4. 5. 6.]
 [7. 8. 9.]]

Links:

Video: 3Blue1Brown
Video: What are Eigenvalues and Eigenvectors?
Computer Vision for dummies: Eigenvalues and Eigenvectors
Machine Learning Mastery
[Image] Eigenvectors (red) do not change direction when a linear transformation is applied to them.