[R] R memory issue for writing out the file
hb at stat.berkeley.edu
Tue Apr 15 19:45:44 CEST 2008
Try writing the data.frame to file in blocks of rows by calling
write.table() multiple times - see its 'append' argument. That will
probably require less memory.
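A minimal sketch of that approach (the data frame, chunk size, and file name here are made up for illustration, not from the original post):

```r
## Hypothetical example: write a data.frame in chunks of rows so that
## write.table() never has to format the whole object at once.
df <- data.frame(a = seq_len(100), b = rnorm(100),
                 c = rep(letters[1:4], 25), d = runif(100))
out <- tempfile(fileext = ".txt")

chunk  <- 25                               # rows per call; tune to your RAM
starts <- seq(1, nrow(df), by = chunk)
for (i in seq_along(starts)) {
  rows <- starts[i]:min(starts[i] + chunk - 1, nrow(df))
  write.table(df[rows, ],
              file      = out,
              append    = i > 1,           # append after the first chunk
              col.names = i == 1,          # write the header only once
              row.names = FALSE,
              sep       = "\t",
              quote     = FALSE)
}
```

Each call only needs to build the character representation of one chunk, which is why the peak memory use stays small even for millions of rows.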
On Tue, Apr 15, 2008 at 6:12 PM, Xiaojing Wang <timanwang at gmail.com> wrote:
> Hello, all,
> First thanks in advance for helping me.
> I am now handling a data frame with 11,095,400 rows and 4 columns. It
> seemed to work perfectly in R on my Mac (Mac Pro, Intel chip, 4 GB RAM) until I
> tried to write the file out using the command:
> I got the error message:
> R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
> R(319,0xa000d000) malloc: *** error: can't allocate region
> R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug
> I then confirmed this in R on Windows (Windows XP, 1 GB RAM) by trying it
> again. It seems to have to do with my R memory allocation limit.
> I read all the online help and still could not figure out how to solve
> the problem. Also, I do not understand why the data can be handled easily
> within R but cannot be written out due to insufficient memory. I am not
> good at either R or computers, so my apologies if these questions sound naive.
> Xiaojing WANG
> Dept. of Human Genetics
> Univ. of Pittsburgh, PA 15261
> Tel: 412-624-8157
> R-help at r-project.org mailing list