In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix. It is the generalization of the eigendecomposition of a positive semidefinite normal matrix (for example, a symmetric matrix with non-negative eigenvalues) to any m × n matrix via an extension of the polar decomposition. It has many useful applications in signal processing and statistics.
Formally, the singular value decomposition of an m × n real or complex matrix M is a factorization of the form M = UΣV∗, where U is an m × m real or complex unitary matrix, Σ is an m × n rectangular diagonal matrix with non-negative real numbers on the diagonal, and V is an n × n real or complex unitary matrix. The diagonal entries σᵢ of Σ are known as the singular values of M. The columns of U and the columns of V are called the left-singular vectors and right-singular vectors of M, respectively.
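The factorization can be illustrated with NumPy's `numpy.linalg.svd` (the choice of library and the specific 2 × 3 matrix below are assumptions of this sketch, not part of the original text):

```python
import numpy as np

# An arbitrary small real matrix, chosen for illustration (m = 2, n = 3).
M = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# full_matrices=True returns U (m x m), the singular values s
# (length min(m, n), descending), and Vh = V* (n x n).
U, s, Vh = np.linalg.svd(M, full_matrices=True)

# Rebuild the m x n rectangular diagonal matrix Sigma from s.
Sigma = np.zeros(M.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# The product U Sigma V* reconstructs M, and U, V are unitary
# (orthogonal, since M is real).
assert np.allclose(U @ Sigma @ Vh, M)
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vh @ Vh.T, np.eye(3))
```

Note that NumPy returns V∗ (here `Vh`) rather than V, so no extra transpose is needed when forming the product.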
The singular value decomposition can be computed using the following observations:

The left-singular vectors of M are a set of orthonormal eigenvectors of MM∗.
The right-singular vectors of M are a set of orthonormal eigenvectors of M∗M.
The non-negative singular values of M (found on the diagonal entries of Σ) are the square roots of the non-negative eigenvalues of both M∗M and MM∗.

Applications that employ the SVD include computing the pseudoinverse, least squares fitting of data, multivariable control, matrix approximation, and determining the rank, range, and null space of a matrix.
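The three observations above can be checked numerically. The sketch below (assuming NumPy and the same kind of small example matrix) verifies that the squared singular values match the eigenvalues of both M∗M and MM∗:

```python
import numpy as np

# Arbitrary example matrix (an assumption of this sketch), m = 2, n = 3.
M = np.array([[3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

# Singular values only, returned in descending order.
s = np.linalg.svd(M, compute_uv=False)

# Eigenvalues of MM* (m x m) and M*M (n x n); both are symmetric,
# so eigvalsh applies. Sort descending to align with s.
eig_MMt = np.sort(np.linalg.eigvalsh(M @ M.T))[::-1]
eig_MtM = np.sort(np.linalg.eigvalsh(M.T @ M))[::-1]

# sigma_i^2 equals the non-negative eigenvalues of both products.
assert np.allclose(s**2, eig_MMt)
assert np.allclose(s**2, eig_MtM[:len(s)])
# When n > m, the remaining eigenvalues of M*M are zero.
assert np.allclose(eig_MtM[len(s):], 0.0)
```

This is also the basis of one classical way to compute the SVD by hand: diagonalize the smaller of M∗M and MM∗, take square roots of its eigenvalues, and recover the remaining singular vectors from the relation Mvᵢ = σᵢuᵢ.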
