
Singular-Value Decomposition (SVD)

Singular-Value Decomposition (SVD) is a matrix decomposition/factorization method that reduces a given matrix to its constituent elements.

Note: Every matrix has an SVD, which makes it more stable than other methods such as eigendecomposition (which exists only for certain square matrices).
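That note can be made concrete: the singular values of A are the square roots of the eigenvalues of the square matrix A^T.A. A minimal sketch of the connection, using the same 3 x 2 matrix as the worked examples below:

from numpy import array, sqrt, sort
from numpy.linalg import svd, eigvalsh

A = array([[1, 2], [3, 4], [5, 6]])

# Singular values of A ...
_, d, _ = svd(A)
print(sort(d))

# ... match the square roots of the eigenvalues of A^T.A
# (eigvalsh returns eigenvalues of a symmetric matrix in ascending order)
print(sqrt(eigvalsh(A.T.dot(A))))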

The decomposition takes the form:

$$A = U \cdot \Sigma \cdot V^T$$

where:

- $A$: the real m x n matrix that we wish to decompose
- $U$: an m x m matrix whose columns are known as the left-singular vectors of A
- $\Sigma$: an m x n diagonal matrix whose diagonal values are known as the singular values of A
- $V^T$: the transpose of an n x n matrix V, whose columns are known as the right-singular vectors of A
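Because the columns of U and V are orthonormal, U^T.U and V.V^T are identity matrices. A minimal sketch verifying the shapes and this orthogonality, again using the 3 x 2 matrix from the examples below:

from numpy import array, allclose, eye
from numpy.linalg import svd

A = array([[1, 2], [3, 4], [5, 6]])
U, d, VT = svd(A)

# U is m x m (3x3), d holds the min(m, n) = 2 singular values, VT is n x n (2x2)
print(U.shape, d.shape, VT.shape)

# The singular vectors are orthonormal, so both checks print True
print(allclose(U.T.dot(U), eye(3)))
print(allclose(VT.dot(VT.T), eye(2)))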

Calculate Singular-Value Decomposition using NumPy

# Calculate Singular-Value Decomposition using NumPy
from numpy import array
from numpy.linalg import svd

A = array([[1, 2], [3, 4], [5, 6]])
print('-------------------')
print('Original Matrix to be decomposed')
print(A)

# svd() returns U, the vector of singular values, and the transpose of V
U, d, VT = svd(A)
print('-------------------')
print('U Matrix')
print(U)

print('-------------------')
print('2 element Sigma vector')
print(d)

print('-------------------')
print('VT Matrix')
print(VT)

Output:

-------------------
Original Matrix to be decomposed
[[1 2]
 [3 4]
 [5 6]]
-------------------
U Matrix
[[-0.2298477   0.88346102  0.40824829]
 [-0.52474482  0.24078249 -0.81649658]
 [-0.81964194 -0.40189603  0.40824829]]
-------------------
2 element Sigma vector
[9.52551809 0.51430058]
-------------------
VT Matrix
[[-0.61962948 -0.78489445]
 [-0.78489445  0.61962948]]
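The negative entries in U and VT are not errors: A equals the sum of rank-1 terms sigma_i * u_i * v_i^T, so each pair of singular vectors is unique only up to a joint sign flip. A minimal sketch of that rank-1 sum, which reproduces A from the output above:

from numpy import array, outer
from numpy.linalg import svd

A = array([[1, 2], [3, 4], [5, 6]])
U, d, VT = svd(A)

# Sum of rank-1 terms: sigma_1 * u_1 * v_1^T + sigma_2 * u_2 * v_2^T
# Flipping the signs of a u_i, v_i pair leaves each term unchanged
print(d[0] * outer(U[:, 0], VT[0]) + d[1] * outer(U[:, 1], VT[1]))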

Reconstruct Matrix from SVD

Note: svd() returns the singular values as a vector, which must be transformed into an m x n matrix before the multiplication:

- Convert the sigma vector into an n x n diagonal matrix using diag()
- Create an empty m x n zero Sigma matrix
- Populate the top n x n block of the Sigma matrix with the diagonal matrix

from numpy import array
from numpy import diag
from numpy import zeros
from numpy.linalg import svd

A = array([[1, 2], [3, 4], [5, 6]])
print('-------------------')
print('Original Matrix')
print(A)

U, d, VT = svd(A)
print('-------------------')
print('2 element Sigma vector')
print(d)

# Convert Sigma vector into diagonal matrix
D = diag(d)
print('-------------------')
print('Diagonal matrix with dimension 2x2 or nxn')
print(D)

# Create an empty m x n Sigma matrix
Sigma = zeros((A.shape[0], A.shape[1]))
print('-------------------')
print('Empty m x n Sigma matrix')
print(Sigma)

# Populate the top n x n block of the Sigma matrix with the diagonal matrix
Sigma[:A.shape[1], :A.shape[1]] = D
print('-------------------')
print('Sigma matrix m x n')
print(Sigma)

# Reconstruct original matrix
O = U.dot(Sigma.dot(VT))
print('-------------------')
print('Reconstructed original matrix m x n')
print(O)

Output:

-------------------
Original Matrix
[[1 2]
 [3 4]
 [5 6]]
-------------------
2 element Sigma vector
[9.52551809 0.51430058]
-------------------
Diagonal matrix with dimension 2x2 or nxn
[[9.52551809 0.        ]
 [0.         0.51430058]]
-------------------
Empty m x n Sigma matrix
[[0. 0.]
 [0. 0.]
 [0. 0.]]
-------------------
Sigma matrix m x n
[[9.52551809 0.        ]
 [0.         0.51430058]
 [0.         0.        ]]
-------------------
Reconstructed original matrix m x n
[[1. 2.]
 [3. 4.]
 [5. 6.]]
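Building the padded Sigma matrix is only needed because svd() returns the full m x m U by default. As an alternative sketch, NumPy's full_matrices=False flag returns the reduced (economy) SVD, where U is m x n and diag(d) multiplies directly with no zero-padding:

from numpy import array, diag
from numpy.linalg import svd

A = array([[1, 2], [3, 4], [5, 6]])

# Reduced SVD: U is m x n (3x2), so the dimensions already line up
U, d, VT = svd(A, full_matrices=False)
print(U.dot(diag(d)).dot(VT))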


Link: Gentle Introduction to Singular-Value Decomposition for Machine Learning