The SVD, a Fantastic Algorithm

However, and this is the strength of the SVD algorithm, any general matrix \boldsymbol{X} can be decomposed in terms of a diagonal matrix and two orthogonal/unitary matrices. The Singular Value Decomposition (SVD) theorem states that a general m\times n matrix \boldsymbol{X} can be written in terms of a diagonal matrix \boldsymbol{\Sigma} of dimensionality m\times n and two orthogonal matrices \boldsymbol{U} and \boldsymbol{V}, where the first has dimensionality m\times m and the second dimensionality n\times n. We then have

\boldsymbol{X} = \boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{V}^T

As an example, the above defective matrix can be decomposed as

\boldsymbol{X} = \frac{1}{\sqrt{2}}\begin{bmatrix} 1& 1 \\ 1& -1\\ \end{bmatrix} \begin{bmatrix} 2& 0 \\ 0& 0\\ \end{bmatrix} \frac{1}{\sqrt{2}}\begin{bmatrix} 1& -1 \\ 1& 1\\ \end{bmatrix}=\boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{V}^T,

with singular values \sigma_1=2 and \sigma_2=0 (note that these differ from the eigenvalues of the defective matrix, which are both zero). The SVD always exists!
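The decomposition above is straightforward to verify numerically. Here is a minimal sketch with NumPy, assuming the defective matrix referred to above is the one consistent with the product \boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{V}^T shown:

```python
import numpy as np

# The defective matrix (consistent with the product U Sigma V^T above)
X = np.array([[1.0, -1.0],
              [1.0, -1.0]])

# numpy.linalg.svd returns U, the singular values, and V^T
U, s, VT = np.linalg.svd(X)
print(s)                      # [2. 0.], i.e. sigma_1 = 2 and sigma_2 = 0

# Reconstruct X from the three factors: X = U Sigma V^T
print(U @ np.diag(s) @ VT)
```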

The SVD orders the singular values such that \sigma_i\geq\sigma_{i+1} for all i, and for indices larger than i=p the singular values are zero.
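This ordering convention, and the vanishing of the trailing singular values, is easy to inspect numerically; a small sketch using a deliberately rank-deficient matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
# A 5 x 3 matrix of rank 2: the third singular value must vanish
A = rng.normal(size=(5, 2)) @ rng.normal(size=(2, 3))

# compute_uv=False returns only the singular values, in descending order
s = np.linalg.svd(A, compute_uv=False)
print(s)    # sigma_1 >= sigma_2 >= sigma_3, with sigma_3 close to zero
```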

In the general case, where our design matrix \boldsymbol{X} has dimension n\times p, the matrix is thus decomposed into an n\times n orthogonal matrix \boldsymbol{U}, a p\times p orthogonal matrix \boldsymbol{V} and a diagonal matrix \boldsymbol{\Sigma} with r=\mathrm{min}(n,p) singular values \sigma_i\geq 0 on the main diagonal and zeros filling the rest of the matrix. There are at most p singular values, assuming that n > p. In our regression examples for the nuclear masses and the equation of state this is indeed the case, while for the Ising model we have p > n. Such situations often lead to near-singular or singular matrices.
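To make the dimensions concrete, here is a minimal NumPy sketch for an n\times p design matrix with n > p; with full_matrices=True one obtains the n\times n matrix \boldsymbol{U} and the p\times p matrix \boldsymbol{V} described above:

```python
import numpy as np

n, p = 6, 3
rng = np.random.default_rng(42)
X = rng.normal(size=(n, p))          # a generic n x p design matrix

U, s, VT = np.linalg.svd(X, full_matrices=True)
print(U.shape, s.shape, VT.shape)    # (6, 6) (3,) (3, 3)

# Place the r = min(n, p) singular values on the diagonal of an n x p Sigma
Sigma = np.zeros((n, p))
Sigma[:p, :p] = np.diag(s)
print(np.allclose(X, U @ Sigma @ VT))   # True: X = U Sigma V^T
```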

The columns of \boldsymbol{U} are called the left singular vectors, while the columns of \boldsymbol{V} are the right singular vectors.
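A standard way to see where these vectors come from: the right singular vectors are eigenvectors of \boldsymbol{X}^T\boldsymbol{X} and the left singular vectors are eigenvectors of \boldsymbol{X}\boldsymbol{X}^T, in both cases with eigenvalues \sigma_i^2. A short numerical sketch of this connection:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(4, 3))
U, s, VT = np.linalg.svd(X, full_matrices=True)

# Eigenvalues of X^T X are the squared singular values sigma_i^2
eigvals = np.linalg.eigvalsh(X.T @ X)          # returned in ascending order
print(np.allclose(np.sort(s**2), eigvals))     # True

# The first right singular vector v_1 satisfies (X^T X) v_1 = sigma_1^2 v_1
v1 = VT[0]
print(np.allclose((X.T @ X) @ v1, s[0]**2 * v1))   # True

# The first left singular vector u_1 satisfies (X X^T) u_1 = sigma_1^2 u_1
u1 = U[:, 0]
print(np.allclose((X @ X.T) @ u1, s[0]**2 * u1))   # True
```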