[R] R memory usage
Jun Ding
dingjun_cn at yahoo.com
Thu Aug 9 03:08:27 CEST 2007
Hi All,
I have two questions about memory usage in R
(sorry if they are naive; I am not familiar
with this at all).
1) I am running R on a Linux cluster. From reading
the R help pages, it seems there are no default upper
limits for vsize or nsize. Is that right? Is there an
upper limit on total memory usage? How can I find out
the defaults in my specific Linux environment, and can
I increase them?
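For reference, here is roughly how I have been trying
to inspect and raise the limits; mem.limits() and the
startup flags below are my reading of ?Memory, so
please correct me if they are not the right tools:

## Inside R: report the current nsize/vsize limits
## (NA here should mean no explicit limit was set at startup)
mem.limits()

## From the shell, the limits can apparently be raised at startup, e.g.
##   R --max-nsize=50M --max-vsize=4G
## and beyond that the ceiling is whatever the OS allows
## (check with "ulimit -v" in the shell on Linux)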
2) I use R to read in several big files (~200 MB
each), and then I run:
gc()
I get:
            used  (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells  23083130 616.4   51411332 1372.9  51411332 1372.9
Vcells 106644603 813.7  240815267 1837.3 227550003 1736.1
What do the columns "used", "gc trigger", and "max
used" mean? It seems to me I have used 616.4 MB of
Ncells and 813.7 MB of Vcells. Comparing those with
the "max used" numbers, I should still have enough
memory. But when I try
object.size(area.results)  ## area.results is a big data.frame
I get an error message:
Error: cannot allocate vector of size 32768 Kb
Why is that? It looks like I am running out of
memory. Is there a way to solve this problem?
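In case it matters, this is the kind of workaround I
have been attempting (the object names, the file name,
and the column classes below are just placeholders for
my actual data, not anything from the docs):

## Drop big objects I no longer need, then ask R to return the memory
rm(list = c("big.file1", "big.file2"))   ## hypothetical object names
gc()

## Re-read one file with explicit column classes, which I understand
## lowers the peak memory read.table() needs while parsing
area.results <- read.table("area1.txt", header = TRUE,
                           colClasses = c("character", rep("numeric", 9)))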
Thank you very much!
Jun