[R] Memory usage in prcomp
Roy Mendelssohn - NOAA Federal
roy.mendelssohn at noaa.gov
Tue Mar 22 15:42:10 CET 2016
Hi All:
I am running prcomp on a very large array, roughly [500000, 3650]. The array itself is 16GB. I am running on a Unix machine and am running “top” at the same time, and I am quite surprised to see that the application memory usage is 76GB. I have “tol” set very high (.8) so that it should only pull out a few components. I am surprised at this memory usage because prcomp uses the SVD if I am not mistaken, and when I take guesses at the size of the SVD matrices they shouldn’t be that large. While I can fit this in memory, for a variety of reasons I would like to reduce the memory footprint. Some questions (rough code sketches follow after the list):
1. I am running with “center=FALSE” and “scale=TRUE”. Would I save memory if I scaled the data first myself, saved the result, cleared out the workspace, read the scaled data back in, and then did the prcomp call? Basically, are the intermediate calculations for scaling kept in memory after use?
2. I don’t know how prcomp memory usage compares to a direct call to “svd”, which lets me explicitly set how many singular vectors to compute (I only need the first five at most). prcomp is convenient because it does a lot of the other work for me.
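For reference, a back-of-the-envelope check of the raw data size, assuming a dense double-precision matrix. Every extra copy that gets made along the way (the scaled matrix, the SVD factors, the scores) adds another block of roughly this size:

nr <- 500000
nc <- 3650
nr * nc * 8 / 2^30   # roughly 13.6 GiB for a single copy of the data

So it only takes a handful of such copies to get near the ~76GB that top reports.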
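On question 1, a minimal sketch of what I have in mind, assuming the data matrix is called x (a made-up name here). With center = FALSE, scale() divides each column by its root mean square, which I believe is the same scaling prcomp applies internally, so the later call can skip it:

x_scaled <- scale(x, center = FALSE, scale = TRUE)  # same column scaling prcomp would do
rm(x)                                               # drop the unscaled copy
gc()                                                # reclaim that memory before the big call
p <- prcomp(x_scaled, center = FALSE, scale. = FALSE, tol = 0.8)

(Writing x_scaled out with saveRDS(), restarting R, and reading it back with readRDS() would be the “clear out the workspace” version of the same idea.)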
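On question 2, a sketch of the direct route, again assuming the scaled matrix is x_scaled and only the first 5 components are wanted. Note that base svd() still computes all min(n, p) singular vectors internally before truncating to nu/nv columns, so this mainly avoids prcomp’s extra copies rather than shrinking the SVD itself; a truncated-SVD package (e.g. irlba) would be needed for a real reduction there:

k  <- 5
sv <- svd(x_scaled, nu = k, nv = k)
rotation <- sv$v                              # first k columns of prcomp()$rotation
scores   <- x_scaled %*% rotation             # first k columns of prcomp()$x
sdev     <- sv$d / sqrt(nrow(x_scaled) - 1)   # prcomp()$sdev (all min(n, p) of them)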
**********************
"The contents of this message do not reflect any position of the U.S. Government or NOAA."
**********************
Roy Mendelssohn
Supervisory Operations Research Analyst
NOAA/NMFS
Environmental Research Division
Southwest Fisheries Science Center
***Note new address and phone***
110 Shaffer Road
Santa Cruz, CA 95060
Phone: (831)-420-3666
Fax: (831) 420-3980
e-mail: Roy.Mendelssohn at noaa.gov www: http://www.pfeg.noaa.gov/
"Old age and treachery will overcome youth and skill."
"From those who have been given much, much will be expected"
"the arc of the moral universe is long, but it bends toward justice" -MLK Jr.