
Singular Value Decomposition (SVD)

The Singular Value Decomposition (SVD) of a matrix is a factorization of that matrix into three matrices. It has some interesting algebraic properties and conveys important geometrical and theoretical insights about linear transformations. It also has some important applications in data science. In this article, I will try to explain the mathematical intuition behind SVD and its geometrical meaning. 

Mathematics behind SVD:

The SVD of an m x n matrix A is given by the formula:

A = U W V^{T}

where:

- U is an m x m orthogonal matrix whose columns are the eigenvectors of A A^{T} (the left-singular vectors of A).
- W is an m x n diagonal matrix whose non-zero entries are the singular values of A, i.e. the square roots of the non-zero eigenvalues of A^{T} A (equivalently of A A^{T}), arranged in decreasing order.
- V^{T} is the transpose of an n x n orthogonal matrix V whose columns are the eigenvectors of A^{T} A (the right-singular vectors of A).
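As a quick illustrative check of the factorization, the minimal sketch below (using NumPy, with the same matrix that appears in the implementation further down) verifies that the three factors reconstruct A and that U and V are orthogonal:

import numpy as np

# Minimal sketch: factor a small matrix and verify A = U W V^T.
A = np.array([[3, 3, 2], [2, 3, -2]])      # same matrix as in the implementation below
U, s, VT = np.linalg.svd(A)                # U: 2x2, s: singular values, VT: 3x3
W = np.zeros(A.shape)                      # build the 2x3 rectangular diagonal matrix W
W[:len(s), :len(s)] = np.diag(s)

print(np.allclose(A, U @ W @ VT))          # True: the factors reconstruct A
print(np.allclose(U.T @ U, np.eye(2)))     # True: U is orthogonal
print(np.allclose(VT @ VT.T, np.eye(3)))   # True: V is orthogonal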



Example

Find the SVD of the matrix

A = \begin{bmatrix} 3 & 2 & 2 \\ 2 & 3 & -2 \end{bmatrix}

First we compute A A^{T}:

A A^{T} = \begin{bmatrix} 17 & 8 \\ 8 & 17 \end{bmatrix}

Its eigenvalues are \lambda = 25 and \lambda = 9, so our singular values are:

\sigma_1 = \sqrt{25} = 5, \qquad \sigma_2 = \sqrt{9} = 3

The right-singular vectors are the eigenvectors of

A^{T} A = \begin{bmatrix} 13 & 12 & 2 \\ 12 & 13 & -2 \\ 2 & -2 & 8 \end{bmatrix}

For \lambda = 25, the system (A^{T} A - 25I)v = 0 has the coefficient matrix

\begin{bmatrix} -12 & 12 & 2 \\ 12 & -12 & -2 \\ 2 & -2 & -17 \end{bmatrix}

which can be row-reduced to:

\begin{bmatrix} 1 & -1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix}

A unit vector in the direction of its solution is:

v_1 = \begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \\ 0 \end{bmatrix}

Similarly, for \lambda = 9, the eigenvector is:

v_2 = \begin{bmatrix} 1/\sqrt{18} \\ -1/\sqrt{18} \\ 4/\sqrt{18} \end{bmatrix}

For the third eigenvector, we can use the property that it is perpendicular to v_1 and v_2, such that:

v_1^{T} v_3 = 0, \qquad v_2^{T} v_3 = 0

Solving the above equations gives the third eigenvector:

v_3 = \begin{bmatrix} 2/3 \\ -2/3 \\ -1/3 \end{bmatrix}

Now, we calculate U using the formula u_i = \frac{1}{\sigma_i} A v_i, which gives

U = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{bmatrix}

Hence, our final SVD equation becomes:

A = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ 1/\sqrt{2} & -1/\sqrt{2} \end{bmatrix} \begin{bmatrix} 5 & 0 & 0 \\ 0 & 3 & 0 \end{bmatrix} \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} & 0 \\ 1/\sqrt{18} & -1/\sqrt{18} & 4/\sqrt{18} \\ 2/3 & -2/3 & -1/3 \end{bmatrix}
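The hand computation above can be double-checked numerically; here is a minimal sketch with NumPy (the signs of the singular vectors may come out flipped, which is still a valid SVD):

import numpy as np

# Check of the worked example: the singular values should be 5 and 3,
# the columns of U should match u1, u2 and the rows of V^T should match
# v1, v2, v3, in each case up to sign.
A = np.array([[3, 2, 2], [2, 3, -2]])
U, s, VT = np.linalg.svd(A)
print(s)                 # [5. 3.]
print(np.round(U, 4))
print(np.round(VT, 4))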

Applications

Calculation of the pseudo-inverse: Suppose we need to calculate the pseudo-inverse of a matrix M. The SVD of M can be written as:

M = U W V^{T}

Multiply both sides by M^{-1}:

I = M^{-1} U W V^{T}

Multiply both sides by V:

V = M^{-1} U W

Multiply both sides by W^{-1}. Since W is the diagonal matrix of singular values, its inverse just has the reciprocal of each singular value on the diagonal:

V W^{-1} = M^{-1} U

Multiply both sides by U^{T}:

V W^{-1} U^{T} = M^{-1} U U^{T} = M^{-1}

The above equation gives the pseudo-inverse:

M^{+} = V W^{-1} U^{T}
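A minimal sketch of this formula, applied to the same matrix used in the implementation below and checked against NumPy's built-in np.linalg.pinv:

import numpy as np

# Pseudo-inverse from the SVD factors: M^+ = V W^{-1} U^T.
M = np.array([[3, 3, 2], [2, 3, -2]])
U, s, VT = np.linalg.svd(M, full_matrices=False)   # compact SVD: U 2x2, s (2,), VT 2x3
W_inv = np.diag(1.0 / s)                           # reciprocal of each singular value
M_pinv = VT.T @ W_inv @ U.T                        # V W^{-1} U^T has shape 3x2
print(np.allclose(M_pinv, np.linalg.pinv(M)))      # True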

Solving a set of homogeneous linear equations (Mx = b):

If b = 0, calculate the SVD of M and take any column of V (i.e., any row of V^{T}) associated with a singular value (in W) equal to 0; such a vector satisfies Mx = 0.

If b \neq 0, multiply both sides by M^{-1}:

M^{-1} M x = M^{-1} b
x = M^{-1} b

From the pseudo-inverse, we know that M^{-1} = V W^{-1} U^{T}.

Hence,

x = V W^{-1} U^{T} b
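A minimal sketch of both cases, using an illustrative rank-deficient 3 x 3 matrix (any singular matrix would do):

import numpy as np

# M is rank-deficient (row 2 = 2 * row 1), so one singular value is zero.
M = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])
U, s, VT = np.linalg.svd(M)
print(np.round(s, 6))              # the last singular value is ~0

# Homogeneous case Mx = 0: the row of V^T for the zero singular value.
x0 = VT[-1]
print(np.allclose(M @ x0, 0))      # True

# Non-homogeneous case Mx = b: x = V W^+ U^T b, i.e. the pseudo-inverse solution.
b = np.array([1.0, 2.0, 0.0])      # chosen to lie in the column space of M
x = np.linalg.pinv(M) @ b
print(np.allclose(M @ x, b))       # True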

Implementation:

In this code, we will calculate the singular value decomposition using NumPy and SciPy, use it to compute the pseudo-inverse, and, in the end, apply SVD to compress an image.

# Imports
 
from skimage.color import rgb2gray
from skimage import data
import matplotlib.pyplot as plt
import numpy as np
from scipy.linalg import svd
 
"""
Singular Value Decomposition
"""
# define a matrix
X = np.array([[3, 3, 2], [2, 3, -2]])
print(X)
# perform SVD
U, singular, V_transpose = svd(X)
# print different components
print("U: ", U)
print("Singular array", singular)
print("V^{T}", V_transpose)
 
"""
Calculate Pseudo inverse
"""
# the inverse of the diagonal matrix of singular values is just the reciprocal of each singular value
singular_inv = 1.0 / singular
# create m x n matrix of zeroes and put singular values in it
s_inv = np.zeros(X.shape)
s_inv[0][0] = singular_inv[0]
s_inv[1][1] = singular_inv[1]
# calculate pseudoinverse
M = np.dot(np.dot(V_transpose.T, s_inv.T), U.T)
print(M)
 
"""
SVD on image compression
"""
 
 
cat = data.chelsea()
plt.imshow(cat)
# convert to grayscale
gray_cat = rgb2gray(cat)
 
# calculate the SVD and plot the image
U, S, V_T = svd(gray_cat, full_matrices=False)
S = np.diag(S)
fig, ax = plt.subplots(5, 2, figsize=(8, 20))
 
curr_fig = 0
for r in [5, 10, 70, 100, 200]:
    # rank-r approximation: keep only the first r singular values/vectors
    cat_approx = U[:, :r] @ S[0:r, :r] @ V_T[:r, :]
    ax[curr_fig, 0].imshow(cat_approx, cmap='gray')
    ax[curr_fig, 0].set_title("k = " + str(r))
    ax[curr_fig, 0].axis('off')
    ax[curr_fig, 1].set_title("Original Image")
    ax[curr_fig, 1].imshow(gray_cat, cmap='gray')
    ax[curr_fig, 1].axis('off')
    curr_fig += 1
plt.show()

                    

Output: 

[[ 3  3  2]
[ 2 3 -2]]
---------------------------
U: [[-0.7815437 -0.6238505]
[-0.6238505 0.7815437]]
---------------------------
Singular array [5.54801894 2.86696457]
---------------------------
V^{T} [[-0.64749817 -0.7599438 -0.05684667]
[-0.10759258 0.16501062 -0.9804057 ]
[-0.75443354 0.62869461 0.18860838]]
--------------------------
# Inverse
array([[ 0.11462451, 0.04347826],
[ 0.07114625, 0.13043478],
[ 0.22134387, -0.26086957]])
---------------------------

Output figure: original image vs. rank-k SVD approximations.

