[R] cannot allocate vector of size... restructuring suggestion please...
tsunhin wong
thjwong at gmail.com
Mon Dec 15 19:12:14 CET 2008
Dear R Users,
I was running some data analysis scripts and ran into this error:
Error: cannot allocate vector of size 27.6 Mb
Doing a "memory.size(max=TRUE)" will give me:
[1] 1506.812
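For what it's worth, the related calls I know of on Windows, as I read
the help pages, are:

    memory.size()            # memory currently in use by R, in Mb
    memory.size(max = TRUE)  # max memory obtained from the OS so far, in Mb
    memory.limit()           # the current memory limit, in Mb

so R seems to have obtained about 1.5GB from the OS so far.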
The current situation is:
I'm working on a Windows Vista 32-bit laptop with 4GB of RAM
(effectively 3GB usable, I assume...)
I have a 450Mb data file loaded into R, split into around 1500
data.frames sitting in the global environment as my data source.
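(If it matters, a rough way to audit where that memory sits, pieced
together from the docs, is something like the following; no particular
object names are assumed.)

    ## rough audit: size of each object in the global environment, in bytes
    obj.sizes <- sapply(ls(envir = globalenv()),
                        function(x) object.size(get(x, envir = globalenv())))
    head(sort(obj.sizes, decreasing = TRUE))  # the largest objects
    sum(obj.sizes) / 1024^2                   # rough total, in Mb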
The way I run this analysis:
I call a batch processing & procedure script
>> it retrieves 4 lists of info (each around 400x100) from an index data.frame, and then calls another script to retrieve info from the corresponding data.frames (named in the 4 lists) in the global environment
>> that second script pulls out roughly a 1000x3 chunk of data for each trial
>> the 1000x3 chunk is passed to a third script and expanded to 20001x3, of which only one 20001x1 column is used
>> each 20001x1 column accumulates into a matrix of up to 20001x1500 (one column per data.frame / trial); since I have to divide the trials into 2 groups and compare them, that means processing 2 matrices of size 20001x750 (see the sketch below)
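If it helps to be concrete, here is a sketch of what I imagine the
accumulation step should look like, with the full result matrix
preallocated up front instead of grown column by column
(get.trial.column() is just a placeholder for my actual retrieval
scripts):

    ## sketch: preallocate the full 20001 x 1500 result matrix once,
    ## then fill one column per trial instead of growing it with cbind()
    n.trials <- 1500
    result <- matrix(NA_real_, nrow = 20001, ncol = n.trials)
    for (i in seq_len(n.trials)) {
        col.i <- get.trial.column(i)  # placeholder: returns one 20001x1 vector
        result[, i] <- col.i
        rm(col.i)                     # drop the temporary right away
    }

A full 20001x1500 double matrix is about 229Mb on its own, so I realize
even one copy of it takes a sizeable chunk of my address space.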
But the allocation error stops the run after the script has processed
around 280 data.frames, i.e. after the first matrix has grown to
20001x280...
I suspect the analysis could run to completion if I restructured my
script a little, but I have no idea where to start...
Also, I know little about R's garbage collection or its ability to
recycle / reuse memory. I suspect some memory is being lost along the
way during the process, and that it might be possible to hand it back
for R to make use of...
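From the help pages I gather that removing finished objects with rm()
and then calling gc() is how one frees that memory for R to reuse,
e.g. (the object name here is just a placeholder):

    ## free a data.frame I have finished with, then collect garbage
    rm(list = "some.finished.trial.df")  # placeholder name
    gc()                                 # run the collector, report usage

Is that the kind of thing I should be doing after each trial?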
Please advise me on the most efficient way of eliminating this
error...
Thanks so much!
Regards,
John