[R] R memory issue for writing out the file
jholtman at gmail.com
Tue Apr 15 21:47:38 CEST 2008
What are you going to do with the table after you write it out? Are
you just going to read it back into R? If so, have you tried using
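The suggestion above is cut off in the archive. If the table is only going to be read back into R, one common approach (an assumption here, not necessarily what the reply went on to recommend) is to skip the text format entirely and use R's binary serialization, which avoids converting every value to a character string:

```r
# Hypothetical small data frame standing in for the 11,095,400 x 4 table.
df <- data.frame(a = 1:1000,
                 b = rnorm(1000),
                 c = letters[1:10],   # recycled to 1000 rows
                 d = runif(1000))

# saveRDS() writes a compressed binary image of the object; no per-value
# text conversion is needed, so peak memory stays close to the object size.
tmp <- tempfile(fileext = ".rds")
saveRDS(df, tmp)

# readRDS() restores the identical object.
df2 <- readRDS(tmp)
identical(df, df2)  # TRUE
unlink(tmp)
```

The binary file is also typically smaller and faster to reload than a tab-delimited dump of the same frame.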
On Tue, Apr 15, 2008 at 12:12 PM, Xiaojing Wang <timanwang at gmail.com> wrote:
> Hello, all,
> First thanks in advance for helping me.
> I am now handling a data frame with 11,095,400 rows and 4 columns. It
> seems to work perfectly in R on my Mac (Mac Pro, Intel chip, 4 GB RAM) until
> I tried to write this file out using the command:
> I got the error message:
> R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
> R(319,0xa000d000) malloc: *** error: can't allocate region
> R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug
> I then confirmed this in R on Windows (Windows XP, 1 GB RAM) by trying it
> again. It seems to have to do with my R memory limit allocation.
> I read all the online help and still could not figure out how to solve
> the problem. I also do not understand why the data can be handled easily
> within R but cannot be written out because of insufficient memory. I am not
> good with either R or computers, so sorry if my naive questions sound
> Xiaojing WANG
> Dept. of Human Genetics
> Univ. of Pittsburgh, PA 15261
> Tel: 412-624-8157
> R-help at r-project.org mailing list
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
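On the question of why a data frame that fits in memory can still fail at write time: write.table() converts the data frame to character before writing, so it temporarily needs extra copies of the data on top of the frame itself. A hedged sketch of one standard workaround is to write the frame in row chunks with append = TRUE, so only a small slice is ever converted at once (the chunk size, separator, and helper name here are illustrative choices, not anything from the original thread):

```r
# Write a large data frame in row chunks to limit peak memory use.
# Each call to write.table() converts only one chunk to text;
# append = TRUE adds later chunks below the first one.
write_in_chunks <- function(df, file, chunk = 100000L) {
  n <- nrow(df)
  starts <- seq(1L, n, by = chunk)
  for (i in seq_along(starts)) {
    rows <- starts[i]:min(starts[i] + chunk - 1L, n)
    write.table(df[rows, , drop = FALSE], file,
                sep = "\t", quote = FALSE, row.names = FALSE,
                col.names = (i == 1),   # header only on the first chunk
                append = (i > 1))
  }
}

# Small illustrative frame (the real one has ~11 million rows).
df <- data.frame(x = 1:250, y = runif(250))
f <- tempfile(fileext = ".txt")
write_in_chunks(df, f, chunk = 100L)

back <- read.table(f, header = TRUE, sep = "\t")
nrow(back) == nrow(df)  # TRUE
unlink(f)
```

With a small enough chunk size, peak memory during the write stays close to the size of one chunk's character representation rather than that of the whole frame.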
What is the problem you are trying to solve?

+1 513 646 9390