[Rd] --max-mem-size (PR#3562)
Paul Gilbert
pgilbert at bank-banque-canada.ca
Fri Jul 25 23:15:52 MEST 2003
Melinda,
Your problem does not really qualify as a bug report, or even as an r-devel topic; you will usually get much speedier responses on r-help. Memory limitations have been discussed on r-help several times in the past year. Below is some advice sent in a message about two months ago.
Paul Gilbert
Yan Yu wrote:
...
>> (2) What decides the memory limit in R? How can I increase it?
In recent versions of R this is controlled by the operating system, unless you
start R with an option that sets a lower limit than the OS allows. On Linux and
Unix it is controlled by:
1/ ulimit (or limit in some shells). A normal user can typically relax this up
to the system's hard limits. On Linux the maximum memory is usually not limited
by ulimit, but it is sometimes necessary to relax the default stack size (see
the first sketch after this list).
2/ The combination of the amount of physical memory and the amount of swap
space. Roughly, on Linux, these are added together to give the limit. This has
changed over the years in Linux and may vary across versions of Unix, but
typically swap space increases the size of problem you can handle. Physical
memory is faster, but swap works. Given prices these days, you might consider
having these add up to around 4G if you want to work on large problems with R
(the second sketch below shows how to check the totals).
3/ The architecture of the processor (e.g. 32-bit vs 64-bit). A program cannot
exceed the address space of the architecture (2^32 = 4G bytes on a typical
32-bit PC processor; 2^64 = vastly more on a 64-bit workstation). The OS itself
needs some of this, so I believe the practical limit on 32-bit Linux is around
3G. (In any case, there is not much point in having more than 4G of swap+memory
on a 32-bit machine.) On a 64-bit architecture with a 64-bit Unix (most
workstations) the application (R) must be compiled as a 64-bit application; the
third sketch below shows one way to check. Thus the fixing of Solaris bugs in
gcc 3.2.3 has meant that much larger problems can now be handled with R on
Solaris (I believe this was possible before with the Sun compilers). It should
also be possible to compile 64-bit R under (64-bit) Linux on Intel Itanium and
AMD Opteron processors. I have no experience with this (but would be interested
in hearing from anyone who does).
On Windows the situation is different and I am much less familiar with it (and
look forward to being corrected). I believe applications must fit into physical
memory on Windows; that is, an application can be swapped out entirely but not
partially. This means that it is necessary to buy more memory to run bigger R
problems. (Of course, with enough physical memory, problems will run much
faster, so you should consider buying more memory even on Unix.) Windows itself
demands some of the memory, so I believe the practical limit for applications
on Windows is 2G bytes. I understand there is a 64-bit version of Windows under
development, but I don't think it has been released yet.
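On the specific error in the quoted report below: --max-mem-size is a
command-line option, given when R is started, not an expression to type at the
R prompt (typed at the prompt, R tries to parse it as arithmetic, hence the
unary-operator error). A sketch of the intended use on Windows (the size is
illustrative):

  C:\> Rgui.exe --max-mem-size=512M

On Windows it may also be possible to raise the limit from within a running
session with something like memory.limit(size=512) (size in Mb); see
help(memory.size) for what your version supports.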
Paul Gilbert
slteng at stat.berkeley.edu wrote:
>Full_Name: Melinda Teng
>Version: 1.71
>OS: Windows ME/ Unix
>Submission from: (NULL) (67.118.2.50)
>
>
>Hi,
>
>I had the following message halt my program on both Windows ME and a Unix
>system:
>
> Error: cannot allocate vector of size 781250 Kb
> In addition: Warning message:
> Reached total allocation of 247Mb: see help(memory.size)
>
>
>Actions taken:
>1. Used --max-mem-size (on Windows) to increase R's memory allocation to its
>maximum, but encountered the message below:
>
> > --max-mem-size
> Error in -max : Invalid argument to unary operator
>
>2. > memory.size(max=TRUE)
> [1] 25714688
> > memory.size()
> [1] 19716928
> > memory.limit(size=NA)
> [1] 259465216
> > memory.limit()
> [1] 259465216
>
>
>Would greatly appreciate any kind advice or resources.
>
>Thank you very much.