[R] a question about swap space, memory and read.table()

Marcus Davy MDavy at hortresearch.co.nz
Thu Dec 9 22:00:14 CET 2004

On 32-bit Windows (standard install of R) a single process cannot allocate more than 2 GB
of memory:
"R --max-mem-size=2G"
"R --max-mem-size=2000M"

If you try to allocate more than 2 GB (e.g. 4000 MB), I suspect R falls back to the default of 1 GB. Check your memory limit with memory.limit(). There is plenty of information about memory configuration in the R for Windows FAQ.
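As a sketch, the Windows-only helpers look like this (memory.limit() and memory.size() are not available on other platforms, so this will only run on a Windows build of R):

```r
# Query the current allocation limit (in MB) for this R session
memory.limit()

# Try to raise the limit; on 32-bit Windows requests above roughly
# 2 GB are refused, since that is the per-process address-space ceiling
memory.limit(size = 2000)

# memory.size() reports current usage; max = TRUE reports the
# maximum amount of memory used so far in this session
memory.size(max = TRUE)
```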

If you want 3 GB per process you will have to modify the R executable to make it large-address aware.


>>> Hu Chen <chencheva at gmail.com> 10/12/2004 3:39:18 AM >>>
Hi all,
Two computers:
One is my desktop PC: Windows 2000, R 1.9.1, 256 MB physical RAM, 384 MB swap
(virtual memory). When I allocate a large matrix, it first uses up RAM and
then swap space; in the Windows Task Manager the memory usage can exceed
my physical RAM.
The other machine is a remote server: Windows XP, R 1.9.1, 2 GB physical RAM,
4 GB swap. I start R with "R --max-mem-size=4000M". However, when I allocate
a large matrix or data frame, it uses up all the RAM and then exits with the
error "cannot allocate vector of size 7812 Kb".
The swap space is not used at all!
What's more, I find that read.table() is really wasteful of memory.
> ft <- read.table("filepath")
> object.size(ft)
[1] 192000692
only about 192 MB; however, the Windows Task Manager shows this process
taking nearly 800 MB of memory.
I used gc() to collect garbage, but it doesn't help.
Does anyone have a way to release the wasted memory?
Thank you all.
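For reference, a common way to reduce read.table()'s memory overhead is to tell it the column types and row count up front, so it need not guess types or over-allocate while scanning; the small temporary file below is purely illustrative:

```r
# Write a tiny illustrative file (stand-in for a real data file)
tf <- tempfile(fileext = ".txt")
writeLines(c("1 2.5 a", "3 4.5 b"), tf)

# Pre-specifying colClasses and nrows lets read.table() allocate
# each column once at the right type, instead of growing and
# re-typing intermediate character vectors
ft <- read.table(tf,
                 colClasses = c("integer", "numeric", "character"),
                 nrows = 2)
print(object.size(ft))

# Memory is only reclaimable once no object references it:
# drop the reference first, then run the garbage collector
rm(ft)
invisible(gc())
unlink(tf)
```

Note that gc() alone cannot release memory still reachable from a named object; rm() must remove the binding first.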

R-help at stat.math.ethz.ch mailing list
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


