And finally \boldsymbol{X}\boldsymbol{X}^T

For \boldsymbol{X}\boldsymbol{X}^T we found

\boldsymbol{X}\boldsymbol{X}^T=\boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{V}^T\boldsymbol{V}\boldsymbol{\Sigma}^T\boldsymbol{U}^T=\boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{\Sigma}^T\boldsymbol{U}^T.

Since \boldsymbol{X}\boldsymbol{X}^T has dimension n\times n, we have

\boldsymbol{\Sigma}\boldsymbol{\Sigma}^T = \begin{bmatrix} \tilde{\boldsymbol{\Sigma}} \\ \boldsymbol{0}\\ \end{bmatrix}\begin{bmatrix} \tilde{\boldsymbol{\Sigma}} & \boldsymbol{0}\\ \end{bmatrix}=\begin{bmatrix} \tilde{\boldsymbol{\Sigma}}^2 & \boldsymbol{0} \\ \boldsymbol{0} & \boldsymbol{0}\\ \end{bmatrix},

leading to

\boldsymbol{X}\boldsymbol{X}^T=\boldsymbol{U}\begin{bmatrix} \tilde{\boldsymbol{\Sigma}}^2 & \boldsymbol{0} \\ \boldsymbol{0} & \boldsymbol{0}\\ \end{bmatrix}\boldsymbol{U}^T.
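
As a quick numerical sanity check, the following NumPy sketch (the matrix \boldsymbol{X} below is just a small random example, not data from the text) verifies that \boldsymbol{U}\boldsymbol{\Sigma}\boldsymbol{\Sigma}^T\boldsymbol{U}^T indeed reproduces \boldsymbol{X}\boldsymbol{X}^T:

```python
import numpy as np

# Arbitrary example matrix with n = 5 rows and p = 3 columns; the entries
# only serve to illustrate the identity X X^T = U (Sigma Sigma^T) U^T.
rng = np.random.default_rng(seed=0)
X = rng.normal(size=(5, 3))

# Full SVD: U is (5, 5), s holds the three singular values, Vt is (3, 3).
U, s, Vt = np.linalg.svd(X, full_matrices=True)

# Assemble the full (5, 3) Sigma, i.e. tilde(Sigma) stacked on top of a zero block.
Sigma = np.zeros_like(X)
Sigma[:3, :3] = np.diag(s)

lhs = X @ X.T
rhs = U @ (Sigma @ Sigma.T) @ U.T
print(np.allclose(lhs, rhs))   # True up to round-off
```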

Multiplying from the right with \boldsymbol{U} and using the orthogonality \boldsymbol{U}^T\boldsymbol{U}=\boldsymbol{I} gives us the eigenvalue problem

(\boldsymbol{X}\boldsymbol{X}^T)\boldsymbol{U}=\boldsymbol{U}\begin{bmatrix} \tilde{\boldsymbol{\Sigma}}^2 & \boldsymbol{0} \\ \boldsymbol{0} & \boldsymbol{0}\\ \end{bmatrix}.

It means that the eigenvalues of \boldsymbol{X}\boldsymbol{X}^T are again given by the squares of the non-zero singular values, now supplemented by a series of zeros. The column vectors of \boldsymbol{U} are the eigenvectors of \boldsymbol{X}\boldsymbol{X}^T and measure how much correlation is contained in the rows of \boldsymbol{X}.
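
This can be checked numerically as well. In the sketch below (again with an arbitrary random \boldsymbol{X}), the eigenvalues of \boldsymbol{X}\boldsymbol{X}^T come out as the squared singular values padded with zeros, and the eigenvectors agree with the columns of \boldsymbol{U} up to an overall sign:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
X = rng.normal(size=(5, 3))        # arbitrary example matrix, n = 5, p = 3

U, s, Vt = np.linalg.svd(X, full_matrices=True)

# Eigen-decomposition of the symmetric n x n matrix X X^T.
eigvals, eigvecs = np.linalg.eigh(X @ X.T)
eigvals = eigvals[::-1]            # eigh sorts ascending; flip to descending
eigvecs = eigvecs[:, ::-1]

# The eigenvalues are the squared singular values padded with n - p zeros ...
print(np.allclose(eigvals, np.concatenate([s**2, np.zeros(2)])))

# ... and the eigenvectors of the non-zero eigenvalues match the columns of U up to sign.
print(np.allclose(np.abs(eigvecs[:, :3]), np.abs(U[:, :3])))
```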

Since we will mainly be interested in the correlations among the features of our data (the columns of \boldsymbol{X}), the quantities of interest for us are the non-zero singular values and the column vectors of \boldsymbol{V}.
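
For completeness, a corresponding sketch (same kind of arbitrary random \boldsymbol{X} as above) illustrates why this pair suffices: the eigenvalues of \boldsymbol{X}^T\boldsymbol{X} are exactly the squared singular values and its eigenvectors are the columns of \boldsymbol{V}:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
X = rng.normal(size=(5, 3))        # arbitrary example matrix, n = 5, p = 3

U, s, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt.T

# Eigen-decomposition of the p x p matrix X^T X, which encodes the feature correlations.
eigvals, eigvecs = np.linalg.eigh(X.T @ X)
eigvals = eigvals[::-1]            # descending order, matching the SVD convention
eigvecs = eigvecs[:, ::-1]

print(np.allclose(eigvals, s**2))               # eigenvalues are the squared singular values
print(np.allclose(np.abs(eigvecs), np.abs(V)))  # eigenvectors are the columns of V (up to sign)
```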