[R] a question about swap space, memory and read.table()

Hu Chen chencheva at gmail.com
Thu Dec 9 15:39:18 CET 2004

Hi all 
Two computers:
One is my desktop PC: Windows 2000, R 1.9.1, 256 MB physical RAM, 384 MB
swap (virtual memory). When I allocate a large matrix, it first uses up
the RAM and then uses the swap space. In the Windows Task Manager, the
memory usage can exceed the size of my physical RAM.
The other machine is a remote server: Windows XP, R 1.9.1, 2 GB physical RAM,
4 GB swap space. I start R with "R --max-mem-size=4000M". However, when I
allocate a large matrix or data frame, it uses up all the RAM and then
exits with the error message "cannot allocate vector of size 7812 Kb".
The swap space is not used at all!
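For reference, this is how I check and set the limit from inside R (a sketch, assuming a Windows build of R where the Windows-only memory.limit() helper is available; the 4000 MB figure just mirrors my command-line flag):

```r
## Report the current address-space cap R will use, in MB.
memory.limit()

## Request a 4000 MB cap from within R (same effect as
## starting with "R --max-mem-size=4000M").
memory.limit(size = 4000)
```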
What's more, I found that the read.table() function is really wasteful of memory:
> ft <- read.table("filepath")
> object.size(ft)
[1] 192000692
only 192 MB; however, the Windows Task Manager shows that this process
takes nearly 800 MB of memory.
I used gc() to collect garbage; however, it doesn't help.
Does anyone have a method to release the wasted memory?
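Here is a sketch of the workaround I am considering, based on the hints in ?read.table (the "filepath" name is from my example above; the all-numeric column assumption and the row-count bound are mine):

```r
## Pre-declare column types so read.table does not keep intermediate
## character copies of every field while guessing the types.
ft <- read.table("filepath",
                 colClasses = "numeric",  # assumes every column is numeric
                 nrows = 1000000,         # hypothetical upper bound on rows
                 comment.char = "")       # skip comment scanning

## Drop temporaries and compact the heap afterwards.
rm(list = setdiff(ls(), "ft"))
gc()
```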
Thank you all.
