[R] about memory

Huntsinger, Reid reid_huntsinger at merck.com
Wed Mar 30 17:23:57 CEST 2005


Clustering with hclust works from a distance matrix; in your case it is
10,000 x 10,000. For various reasons several copies are created along the
way, so you probably need at least

100M x 8 bytes per entry x 3 copies = 2.4 GB

just for the distance matrix. If you don't have that much RAM, the
computation will start swapping and will probably take longer than you're
willing to wait.
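
As a sanity check, here is the same arithmetic in R; the factor of 3 for
working copies is a rough guess (as above), not a measured figure:

    n <- 10000
    entries <- n * n                   # full 10,000 x 10,000 matrix
    copies  <- 3                       # rough guess at working copies
    entries * 8 * copies / 1024^3      # about 2.2 GiB, i.e. roughly 2.4e9 bytes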

Reid Huntsinger

-----Original Message-----
From: r-help-bounces at stat.math.ethz.ch
[mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of ronggui
Sent: Wednesday, March 30, 2005 5:37 AM
To: r-help at stat.math.ethz.ch
Subject: [R] about memory


Here is my system memory:
ronggui at 0[ronggui]$ free
             total       used       free     shared    buffers     cached
Mem:        256728      79440     177288          0       2296      36136
-/+ buffers/cache:      41008     215720
Swap:       481908      60524     421384

and I want to cluster my data using hclust. My data has 3 variables and
10,000 cases, but the call fails, saying there is not enough memory for the
vector size. I read the help doc and used $ R --max-vsize=800M to start R
2.1.0beta under Debian Linux, but it still cannot compute the solution. So
is my PC's memory not enough to carry out this analysis, or did I make a
mistake in setting the memory?
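
(For scale: dist() stores the lower triangle, n*(n-1)/2 doubles, so one can
size a smaller run with object.size() and extrapolate; the numbers below
are approximate.)

    n <- 1000
    x <- matrix(rnorm(3 * n), ncol = 3)  # 3 variables, n cases
    d <- dist(x)                         # lower triangle: n*(n-1)/2 doubles
    object.size(d)                       # ~4 MB; grows as n^2, so ~400 MB at n = 10000

At n = 10000 the dist object alone is around 400 MB, so an 800M vector
limit leaves little headroom once working copies are made.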

thank you.

______________________________________________
R-help at stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide!
http://www.R-project.org/posting-guide.html