[R] maximizing available memory under windows XP

Prof Brian Ripley ripley at stats.ox.ac.uk
Thu Jan 26 16:41:12 CET 2006


The problem is your test.  You are repeatedly re-allocating large 
objects, so memory fragmentation will take its toll, and its effects 
are almost random.
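Each pass of a grow-in-a-loop test asks the allocator for a fresh, ever-larger contiguous block, which is exactly what a fragmented address space struggles to supply.  A minimal sketch of the contrast (not the poster's exact code; sizes shrunk for illustration) between growing via rbind() and pre-allocating once:

```r
## Growing an object: every rbind() copies the old matrix into a newly
## allocated, larger contiguous block, leaving the old block behind as
## a free hole -- the source of the fragmentation described above.
b <- diag(100)                        # modest sizes for illustration
for (i in 1:3) b <- rbind(b, diag(100))

## Pre-allocating requests one large block up front instead:
pre <- matrix(0, nrow = 4 * 100, ncol = 100)
for (i in 1:4) pre[((i - 1) * 100 + 1):(i * 100), ] <- diag(100)
```

Both end up as a 400 x 100 matrix, but the grown version made four separate large allocations along the way.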

Using the CRAN binary of 2.2.1 and

> for(i in 1:1000) assign(paste("r", i, sep="."), rnorm(1e6))
> gc()
             used   (Mb) gc trigger   (Mb)  max used   (Mb)
Ncells    170480    4.6     350000    9.4    350000    9.4
Vcells 253062963 1930.8  253513169 1934.2 253063937 1930.8
> memory.size(max=T)
[1] 2037137408

I was able to get close to the address-space limit on my laptop, which 
does not have the /3GB switch set (that switch lives *in boot.ini*, not 
in the binary).
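On Windows builds of R 2.x, R's own reporting functions (all Windows-only; a hedged sketch of an interactive session, not the exact commands used in this thread) let you check how close a session is to the cap:

```r
## Windows-only in R 2.x: report and adjust the session's memory use.
memory.size()            # Mb currently committed by R
memory.size(max = TRUE)  # most ever obtained from the OS this session
memory.limit()           # current cap in Mb
## memory.limit(size = 3000)  # raise the cap; only helps with /3GB set
##                            # and a large-address-aware binary
```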

And that binary is marked:

[d:/R/R-2.2.1/bin]% dumpbin /HEADERS Rterm.exe
Microsoft (R) COFF Binary File Dumper Version 6.00.8447
Copyright (C) Microsoft Corp 1992-1998. All rights reserved.


Dump of file Rterm.exe

PE signature found

File Type: EXECUTABLE IMAGE

FILE HEADER VALUES
              14C machine (i386)
                6 number of sections
         43A80482 time date stamp Tue Dec 20 13:17:54 2005
                0 file pointer to symbol table
                0 number of symbols
               E0 size of optional header
              32F characteristics
                    Relocations stripped
                    Executable
                    Line numbers stripped
                    Symbols stripped
                    Application can handle large (>2GB) addresses
                    32 bit word machine
                    Debug information stripped


On Thu, 26 Jan 2006, roger bos wrote:

> I have always used editbin to set the 3GB switch in the Windows
> binary, but version 2.2.1 has this set by default (which I verified using
> dumpbin).  However, when I generate junk data to fill up my memory and read
> the memory usage with gc(), I am not getting as good results
> with 2.2.1 patched as I was with 2.2.0 after I edited the header.  Under R
> 2.2.0 I was able to use over 2GB, but with R 2.2.1 patched I can access only
> 1GB.  Does anyone have suggestions I can try?  My machine has 4GB and I
> have modified the Boot.ini file.
>
> Thanks,
>
> Roger
>
> Here is the gc() on 2.2.1 patched:
>> gc()
>           used  (Mb) gc trigger   (Mb)  max used   (Mb)
> Ncells   252021   6.8     467875   12.5    379294   10.2
> Vcells 71097226 542.5  140857919 1074.7 140597245 1072.7
>>
>
> Here is the gc() output on 2.2.0 after I edited the header:
>> gc()
>            used  (Mb) gc trigger   (Mb)  max used   (Mb)
> Ncells    174118   4.7     350000    9.4    350000    9.4
> Vcells 130065529 992.4  257820332 1967.1 257565551 1965.1
>>
>
> Here is the code that I used to fill my memory (nothing fancy):
> a <- diag(1000)
> b <- a
> for (i in 1:1000000) {
>     a <- diag(1000)
>     b <- rbind(b, a)
> }
> gc()
>
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
>

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
