[R] R memory issue for writing out the file
mtmorgan at fhcrc.org
Tue Apr 15 19:42:43 CEST 2008
That's a big table!
You might try 'write' (though you'll have to work harder to get your data into
an appropriate format).
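A minimal sketch of the 'write' approach, assuming a small stand-in data frame (the poster's actual object and file name are unknown): 'write' streams an atomic vector or matrix to a file, so the data frame must first be collapsed to a common type and transposed so the values come out row by row.

```r
## Hypothetical stand-in for the poster's 11-million-row data frame
df <- data.frame(a = 1:5, b = letters[1:5],
                 c = runif(5), d = LETTERS[1:5])

## as.matrix coerces all columns to character; t() transposes so that
## write(), which emits values column-wise, produces one input row per
## output line
m <- t(as.matrix(df))
write(m, file = "out.txt", ncolumns = ncol(df), sep = "\t")
```

This avoids the formatting overhead of write.table, at the cost of losing per-column types and the header.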
You might also try the R-2.7 release candidate, which I think is available
for the Mac. There was a change in R-2.7 that makes writing large
tables without row names more efficient; this might well be where you
are running into problems.
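To benefit from that change, row names have to be suppressed explicitly. A small sketch, with a placeholder data frame and file name standing in for the poster's data:

```r
## Hypothetical stand-in data; the poster's real table has
## 11,095,400 rows and 4 columns
df <- data.frame(x = rnorm(10), y = rnorm(10))

## row.names = FALSE skips generating (and formatting) a row-name
## column; quote = FALSE also avoids quoting overhead for large output
write.table(df, file = "big_table.txt", sep = "\t",
            row.names = FALSE, quote = FALSE)
```

For very large tables, writing in chunks with append = TRUE is another common way to keep the peak memory needed for formatting down.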
Xiaojing Wang wrote:
> Hello, all,
> First thanks in advance for helping me.
> I am now handling a data frame with 11,095,400 rows and 4 columns. It
> seems to work perfectly in R on my Mac (Mac Pro, Intel chip, 4 GB RAM) until
> I try to write the file out using the command:
> I got the error message:
> R(319,0xa000d000) malloc: *** vm_allocate(size=88764416) failed (error
> R(319,0xa000d000) malloc: *** error: can't allocate region
> R(319,0xa000d000) malloc: *** set a breakpoint in szone_error to debug
> I then confirmed this in R on Windows (Windows XP, 1 GB RAM) by trying it
> again. It seems to have to do with my R memory limit allocation.
> I read all the online help and still could not figure out a way to solve
> the problem. Also, I do not understand why the data can be easily handled
> within R but cannot be written out due to insufficient memory. I am not
> good with either R or computers. Sorry for my naive questions if it sounds
Computational Biology / Fred Hutchinson Cancer Research Center
1100 Fairview Ave. N.
PO Box 19024 Seattle, WA 98109
Location: Arnold Building M2 B169
Phone: (206) 667-2793