[R] Cannot allocate memory of size x on Linux - what's the solution?

davew0000 davejwood at gmail.com
Tue Sep 29 10:53:47 CEST 2009


Hi all,

I'm running an analysis with the randomForest package, applied to data
matrices of ~60,000 rows and roughly 40 to 200 columns. I get the same
error with all of the data files ("cannot allocate vector of size 428.5 Mb").
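
For reference, the calls look roughly like the sketch below. The data,
class labels and ntree value here are simulated placeholders just to show
the shape of the problem, not my actual script:

    ## rough sketch of the kind of call that hits the error (simulated data)
    library(randomForest)

    n <- 60000   # rows
    p <- 200     # columns, at the upper end of my files

    ## the raw matrix itself is only about n * p * 8 bytes ~= 92 Mb ...
    x <- matrix(rnorm(n * p), nrow = n, ncol = p)
    y <- factor(sample(c("case", "control"), n, replace = TRUE))

    ## ... but a call along these lines is where the
    ## "cannot allocate vector of size 428.5 Mb" error appears
    rf <- randomForest(x = x, y = y, ntree = 500)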

I found dozens of threads about this problem, but they never seem to reach a
conclusion. Usually the OP is pointed to the memory allocation help page
(from which I haven't been able to work out what the solution is on Linux),
and the last post is the OP saying they still haven't sorted out their
problem.

I'm running on a Linux machine with 64GB of RAM, so it's not a lack of
system resources.
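
In case it's relevant, these are the sorts of checks I've been using to
confirm the build really is 64-bit and that nothing is capping the R
process (only a sketch; the ulimit/free commands are run from the shell
that launches R, not from R itself):

    ## on the R side
    .Machine$sizeof.pointer   # 8 => 64-bit build of R
    sessionInfo()             # platform should report x86_64-...-linux-gnu
    gc()                      # how much memory R has actually used so far

    ## from the shell that starts R:
    ##   ulimit -v   # "unlimited" => no per-process virtual memory cap
    ##   free -g     # total/used RAM on the machine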

Can anyone tell me how I can get R to allocate larger vectors on Linux? 

Many thanks,

Dave