[R] SVD of a variance matrix

Giovanni Petris GPetris at uark.edu
Tue Apr 15 23:43:07 CEST 2008


I suppose this is more a matrix theory question than a question on R,
but I will give it a try...

I am using La.svd to compute the singular value decomposition (SVD) of
a variance matrix, i.e., a symmetric nonnegative definite square
matrix. Let S be my variance matrix, and S = U D V' be its SVD. In my
numerical experiments I always got U = V. Is this necessarily the
case, or might I eventually run into an SVD with U != V?
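For concreteness, this is the kind of check I have been running (a
minimal sketch with a random variance matrix, not my actual data):

```r
# Build a variance matrix (symmetric, nonnegative definite) from
# random data and compare the left and right singular vectors.
set.seed(1)
X <- matrix(rnorm(20), nrow = 5)   # 5 observations of 4 variables
S <- crossprod(X) / nrow(X)        # S = X'X / n, symmetric PSD
sv <- La.svd(S)                    # returns components u, d, and vt = t(V)
all.equal(sv$u, t(sv$vt))          # TRUE in every run I have tried
```

Note that La.svd returns vt, the transpose of V, so the comparison is
against t(sv$vt).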

Thank you in advance for your insights and pointers. 



Giovanni Petris  <GPetris at uark.edu>
Associate Professor
Department of Mathematical Sciences
University of Arkansas - Fayetteville, AR 72701
Ph: (479) 575-6324, 575-8630 (fax)

More information about the R-help mailing list