[R] Memory problem: Failing to increase dynamic memory

Prof Brian Ripley ripley at stats.ox.ac.uk
Tue Jul 1 18:15:22 CEST 2003

On Tue, 1 Jul 2003, Tapan Mehta wrote:

> Hello,
> I am trying to use R 1.7 on Linux and am having some
> problems with memory. The task has to handle 100 files
> of 10MB each (each file is a .CEL file) and is related
> to microarrays. I am trying to run this task on a 2GB
> or a 4GB node of a Beowulf Linux cluster. However,
> I am getting an error saying 'array size of 336000KB
> cannot be initialized'.
> I have read R's help on memory but have not
> been able to increase the dynamic memory. I used to
> use the memory.limit(size = ) function in the older
> version of R (1.6.1).

That function only exists in the Windows port of R, so you are not comparing
like with like.  The Linux port has no preset memory limit.  You are probably
just running out of memory and need to reorganize your computations.
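One common way to reorganize such a computation is to process the files one at
a time and keep only a small per-file summary, so that at most one ~10MB file
is ever in memory.  A minimal, self-contained sketch (the file layout and the
mean() summary are illustrative assumptions, not the actual CEL-file
workflow):

```r
## Create a few small fake "data files" in a temporary directory,
## purely so the demo is self-contained.
dir <- tempdir()
for (i in 1:3)
  writeLines(as.character(rnorm(100)), file.path(dir, paste0("f", i, ".txt")))

## Process one file at a time, keeping only a per-file summary.
files <- list.files(dir, pattern = "^f[0-9]+\\.txt$", full.names = TRUE)
results <- numeric(length(files))
for (i in seq_along(files)) {
  x <- scan(files[i], quiet = TRUE)   # read one file's values
  results[i] <- mean(x)               # keep only a small summary
  rm(x); gc()                         # free memory before the next file
}
```

The same pattern scales to 100 files: memory use is bounded by the largest
single file plus the accumulated summaries, rather than by the total data
size.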

> In the latest version, however, I
> am unable to achieve this using the function
> mem.limits(). It would be nice if somebody could help
> me out with this.

Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
