[R] problems with memory allocation
thomas at biostat.washington.edu
Thu Oct 21 17:32:24 CEST 1999
On Thu, 21 Oct 1999, Manuel wrote:
> I hope that someone has had a similar problem and will be able to
> help us:
> We have installed R on a Digital workstation with 500 MB of RAM,
> running under a Unix operating system. The package works fine, but when
> we try to start the program with more than 120 MB (--vsize 120M), the
> workstation refuses to allocate this memory. The message we get is:
> Fatal error: Could not allocate memory for vector heap.
> Someone told us that the solution was an appropriate ulimit call, but
> when we run ulimit -a we get only the number 1048576, which we think
> may be the data segment size.
> When we do
> ulimit -d unlimited
> ulimit -s unlimited
> ulimit -m unlimited
> ulimit -v unlimited
> we get the following message:
> Requested ulimit exceed hard limit. We think this means that we have
> no limit on the amount of memory that can be allocated.
> We have installed the same version of the program under Linux (Red Hat
> 6.0) and were also unable to allocate more than 120 MB there.
> I would be very grateful if someone could give us some advice on how
> to solve this problem.
> Note: I don't know whether R is able to allocate more than 120 MB.
> We need about 250 MB of memory because we are currently dealing with
> high-dimensional problems.
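For readers hitting the same error, a minimal sketch of how soft and hard ulimits interact (sh/bash syntax; exact flags and units vary between Unix shells, so treat the commands as illustrative):

```shell
# Soft limits are per-process and can be raised only up to the hard limit;
# hard limits can be lowered by any user but raised only by root.
ulimit -S -d     # current soft data-segment limit (KB on most shells)
ulimit -H -d     # corresponding hard limit

# Trying to push a soft limit past the hard limit produces the
# "Requested ulimit exceed hard limit" error quoted above:
ulimit -S -d unlimited 2>/dev/null \
  || echo "soft limit is capped by the hard limit"
```

So the error message means the opposite of "no limit": the shell refused to raise the soft limit because a hard limit is in force.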
I have no problem allocating --vsize 250M using R 0.65.1 on either Debian
GNU/Linux or Solaris 2.7. In fact, I can allocate --vsize 1000M under
Solaris, which is substantially larger than physical memory:
wompom% ~/Rarchive/R --vsize 1000M
Ncells 128269 250000
Vcells 131024950 131072000
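Assuming R's vector heap is made of 8-byte Vcells (which the numbers above imply), the Vcells limit printed by R is simply the requested vsize divided by 8:

```shell
# --vsize 1000M = 1000 * 1024 * 1024 bytes of vector heap,
# at 8 bytes per Vcell:
echo $(( 1000 * 1024 * 1024 / 8 ))   # 131072000, the Vcells limit above
```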
Assistant Professor, Biostatistics
University of Washington, Seattle
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !) To: r-help-request at stat.math.ethz.ch