In linear algebra, the singular value decomposition (SVD) is a very useful mathematical tool for analyzing the properties of a matrix.

Fundamental to the idea of the SVD is the theorem which states that *every* matrix (say the *m*×*n* matrix *A*) can be expressed as a product of three matrices:

*A* = *U*Σ*V*^{T}

Here, *U* is an *m*×*m* orthogonal matrix, Σ is an *m*×*n* diagonal matrix, and *V* is an *n*×*n* orthogonal matrix (the 'T' superscript denotes the transpose, as usual). This decomposition of *A* into *U*, Σ, and *V* is called the *singular value decomposition* of *A*.
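As a quick sketch of the theorem, we can compute the three factors with NumPy and check that they multiply back to *A* (the sample matrix below is arbitrary; any real matrix works):

```python
import numpy as np

# An arbitrary 3x2 matrix for illustration.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# full_matrices=True yields U (m x m) and V^T (n x n), as in the theorem.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Assemble the m x n diagonal matrix Sigma from the vector of singular values.
m, n = A.shape
p = min(m, n)
Sigma = np.zeros((m, n))
Sigma[:p, :p] = np.diag(s)

# Verify A = U Sigma V^T (up to floating-point error).
print(np.allclose(A, U @ Sigma @ Vt))
```

Note that `np.linalg.svd` returns *V*^{T} directly (here named `Vt`), not *V*, and returns the singular values as a vector rather than as the matrix Σ.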

The diagonal values of Σ are called the *singular values* of *A*. If we define *p* = min(*m*, *n*) we can call them σ_{1}, σ_{2}, ..., σ_{p} from the top left to the bottom right of Σ. They are always ordered from largest to smallest. If *r* is the rank of *A*, then there are exactly *r* non-zero singular values in the SVD for *A*. In addition, the value of the 2-norm of *A* is equal to σ_{1}.
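These three facts -- the ordering of the singular values, the count of non-zero ones equaling the rank, and the 2-norm equaling σ_{1} -- can be checked numerically. A small sketch (the rank-2 matrix below is illustrative; its third row is the sum of the first two):

```python
import numpy as np

# A 3x3 matrix of rank 2: row 3 = row 1 + row 2.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [1.0, 2.0, 0.0]])

# compute_uv=False returns only the singular values.
s = np.linalg.svd(A, compute_uv=False)

print(s)                          # already sorted largest to smallest
print(int(np.sum(s > 1e-10)))     # number of non-zero singular values = rank = 2
print(np.isclose(np.linalg.norm(A, 2), s[0]))  # 2-norm of A equals sigma_1
```

In floating point the "zero" singular values come out as tiny numbers, so a small tolerance (here `1e-10`) is used when counting them.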

The columns of *V* -- designated **v**_{1}, **v**_{2}, ..., **v**_{n} -- are called the *right singular vectors* of *A*. Similarly, the columns of *U* -- designated **u**_{1}, **u**_{2}, ..., **u**_{m} -- are called the *left singular vectors* of *A*. It turns out that the vectors **v**_{r+1}, **v**_{r+2}, ..., **v**_{n} form a basis for the null space of *A*, and the vectors **u**_{1}, **u**_{2}, ..., **u**_{r} form a basis for the range of *A*.
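We can see these two bases concretely. In the sketch below (the rank-1 matrix is illustrative), the trailing right singular vectors are annihilated by *A*, and any product *A***x** lies in the span of the leading left singular vectors:

```python
import numpy as np

# A 2x3 matrix of rank 1: row 2 = 2 * row 1.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)
r = int(np.sum(s > 1e-10))        # numerical rank, here r = 1

# v_{r+1}, ..., v_n span the null space: A v = 0 for each of them.
null_basis = Vt[r:, :]            # rows of V^T are the right singular vectors
print(np.allclose(A @ null_basis.T, 0))

# u_1, ..., u_r span the range: any A x is a combination of them.
range_basis = U[:, :r]
x = np.array([1.0, -1.0, 2.0])
y = A @ x
coeffs = range_basis.T @ y        # coordinates of y in the range basis
print(np.allclose(range_basis @ coeffs, y))
```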

The SVD has the following geometric interpretation. Recall that an orthogonal matrix is either a rotation matrix or a reflection matrix. Also, a diagonal matrix serves as a non-uniform scale and -- if the matrix is non-square -- a change in the number of dimensions. Now, if we multiply *A* from the right by a column vector **x**, by the SVD theorem we have that

*A***x** = *U*Σ*V*^{T}**x** =
*U*(Σ(*V*^{T}**x**)).

This means that multiplication of a vector by *any*^{1} matrix *A* is equivalent to the following sequence of transformations on the vector:

- Rotation or reflection around the origin
- Non-uniform scale around the origin and possible change in the number of dimensions
- Rotation or reflection around the origin (not necessarily the same as in step 1)
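The three steps above can be carried out one at a time and compared against multiplying by *A* directly. A sketch (the 3×2 matrix is illustrative):

```python
import numpy as np

# A maps vectors from R^2 into R^3.
A = np.array([[2.0, 1.0],
              [1.0, 3.0],
              [0.0, 1.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=True)
m, n = A.shape
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)

x = np.array([0.6, -0.8])   # a unit vector in R^2

step1 = Vt @ x              # 1. rotation/reflection in the source space
step2 = Sigma @ step1       # 2. non-uniform scale, moving from R^2 to R^3
step3 = U @ step2           # 3. rotation/reflection in the target space

print(np.allclose(step3, A @ x))   # the same result as A x
```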

This also means that the unit sphere in *n* dimensions will be transformed by *A* into an ellipse in *m* (or fewer) dimensions. In this sense, the left singular vectors represent the axes of the ellipse (major and minor axes if in 2D) in the range of *A*. The right singular vectors represent the unit vectors which map to the left singular vectors.

The singular value decomposition is not unique. One source of ambiguity lies in the fact that the singular vectors may be flipped in right-left pairs without changing *A*. A second source arises from the fact that the axes of the ellipse are undefined if the ellipse is in fact a circle. To see this, think of the SVD for the identity matrix.
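The identity-matrix case can be made concrete: for *I*, every singular value is 1 (the ellipse is a circle), so *any* orthogonal *Q* gives a valid SVD with *U* = *V* = *Q* and Σ = *I*. A sketch (the rotation angle is arbitrary):

```python
import numpy as np

theta = 0.7   # any angle works
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # an orthogonal (rotation) matrix

I = np.eye(2)
# U = V = Q and Sigma = I reproduce the identity: Q I Q^T = I.
print(np.allclose(Q @ I @ Q.T, I))
```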

A useful application of the SVD is to define a matrix called the pseudoinverse of *A*. You can also use it to find a least-squares solution to a homogeneous system of linear equations.
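The pseudoinverse is easy to sketch from the SVD factors: invert each non-zero singular value, leave the zero ones at zero, and reassemble with the roles of *U* and *V* swapped. The example below checks the result against NumPy's built-in `pinv` (the sample matrix is illustrative):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Invert only the non-zero singular values (tolerance guards against
# dividing by numerically-zero values).
s_inv = np.where(s > 1e-10, 1.0 / s, 0.0)

# Pseudoinverse: A^+ = V Sigma^+ U^T, an n x m matrix.
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

print(np.allclose(A_pinv, np.linalg.pinv(A)))
```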

^{1} With compatible dimensions, of course.