I'm a bit confused with PCA/LSI and SVD.
In PCA, we do an eigenvalue decomposition of the covariance matrix of the original matrix M. M does not have to be square or symmetric. (And LSI is a special name for PCA when it is applied to documents.)
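To make that concrete, here is a minimal sketch of PCA as eigendecomposition of the covariance matrix, assuming numpy and a made-up toy data matrix M with rows as observations and columns as features:

```python
import numpy as np

# Toy data (hypothetical): 100 observations of 5 features
rng = np.random.default_rng(0)
M = rng.normal(size=(100, 5))

Mc = M - M.mean(axis=0)                 # center each column (feature)
cov = np.cov(Mc, rowvar=False)          # 5x5 covariance matrix: square and symmetric

eigvals, eigvecs = np.linalg.eigh(cov)  # eigh handles symmetric matrices; ascending order
order = np.argsort(eigvals)[::-1]       # re-sort by descending variance
components = eigvecs[:, order[:2]]      # keep the top-2 principal directions

M_reduced = Mc @ components             # project the data onto the principal components
print(M_reduced.shape)                  # (100, 2)
```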
In SVD, for the original matrix M, we do eigenvalue decompositions of MᵀM and MMᵀ (Mᵀ denotes the transpose of M) respectively, and use their eigenvalues and the two sets of eigenvectors as the result of the SVD. M does not have to be square or symmetric either.
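A small sketch of that SVD view, again assuming numpy and a toy matrix, checking that the squared singular values match the eigenvalues of MᵀM:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.normal(size=(6, 4))             # not square, not symmetric

# Thin SVD: M = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# Columns of V (rows of Vt) are eigenvectors of M^T M, columns of U are
# eigenvectors of M M^T, and the squared singular values are the shared
# nonzero eigenvalues of both products.
eigvals_MtM, _ = np.linalg.eigh(M.T @ M)
print(np.allclose(np.sort(s**2), np.sort(eigvals_MtM)))  # True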
What is the difference between PCA and SVD when they are used for dimensionality reduction?
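For reference, here is one way to see numerically how the two routes relate on centered data (a sketch under the same toy-data assumptions as above, not a full answer):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(100, 5))
Mc = M - M.mean(axis=0)                 # PCA centers the data; apply SVD to the centered matrix

# PCA route: eigenvalues of the covariance matrix
cov_eigvals, _ = np.linalg.eigh(np.cov(Mc, rowvar=False))

# SVD route: singular values of the centered matrix
_, s, _ = np.linalg.svd(Mc, full_matrices=False)

# The covariance eigenvalues equal the squared singular values divided by (n - 1)
print(np.allclose(np.sort(s**2 / (len(Mc) - 1)), np.sort(cov_eigvals)))  # True
```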
Thursday, February 1, 2007