[R-SIG-Mac] memory allocation problems
Thomas Lumley
tlumley at u.washington.edu
Mon Jun 30 21:57:38 CEST 2008
It might be worth reminding people that gc() reports the maximum (R heap)
memory use as well as current memory use.
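
For example (a quick sketch, not tied to Antonio's model):

gc(reset = TRUE)     # reset the "max used" columns
x <- numeric(1e7)    # allocate ~80 MB of doubles
rm(x); gc()          # "used" drops back down, "max used" still shows the peak
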
The memory profiler might also be useful for tracking down what is
happening, but I don't think it's compiled into the CRAN binary of R.
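
If your build does have it, capabilities("profmem") will be TRUE and a sketch
like this logs each sizeable allocation to a file:

if (capabilities("profmem")) {
    Rprofmem("Rprofmem.out")     # start writing allocation records to a file
    x <- numeric(1e6)            # example allocation (~8 MB)
    Rprofmem(NULL)               # stop profiling
    cat(readLines("Rprofmem.out", n = 5), sep = "\n")
}
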
-thomas
On Mon, 30 Jun 2008, Kasper Daniel Hansen wrote:
> On Jun 30, 2008, at 12:07 PM, Antonio P. Ramos wrote:
>
>> Thanks for the comments.
>>
>> What I'm doing is very simple: I'm running a one-dimensional item
>> response model, similar to the ones used in psychology and educational
>> testing, fitted via Markov chain Monte Carlo (MCMC) methods.
>>
>>
>> model_m12 <- ideal(rollcall_m2, maxiter = 500000000, thin = 1000,
>>                    burnin = 5000, store.item = TRUE, normalize = TRUE,
>>                    priors = list(xp = 1e-12, xpv = 1e-12,
>>                                  bp = 1e-12, bpv = 1e-12),
>>                    verbose = TRUE)
>>
>> # my data matrix comes from the rollcall object, rollcall_m2; it is
>> # only 155 x 17
>>
>> # the number of stored iterations is maxiter/thin = 500,000
>>
>> # store.item = TRUE is the main source of the problem: it stores the
>> # discrimination parameters, which consumes a large amount of memory.
>> # Unfortunately, I need this information.
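>>
>> A rough back-of-envelope (just my guess at the layout: each kept draw
>> stores the 155 ideal points plus 17*2 item parameters as 8-byte doubles)
>> suggests the kept draws alone are around 0.7 GB:
>>
>> n_keep <- 500000000 / 1000              # 500,000 kept draws
>> (155 + 17 * 2) * n_keep * 8 / 2^30      # ~0.7 GB for the kept draws alone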
>>
>>
>> So, if R can access up to 3.5 GB, how can I fix the problem? I'm sure
>> lots of Mac users would also be interested in increasing R's memory
>> allocation capabilities.
>
> It works out of the box (that means you do not have to do anything). The fact
> that you get an out-of-memory error just means that your R session is using
> more than 3.4 GB of RAM. Think of it in the following way: say you are using
> 3.3 GB of RAM and then you create a new object with a size of 0.2 GB, which
> takes you over the limit. Then you would get an error of "cannot allocate 0.2
> GB of RAM". So the error you get does not mean that you cannot use more than
> 1.2 GB. It just means you went over the 3.4 GB limit.
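>
> If you want to see where the memory is going before the failing allocation,
> a quick sketch like this (nothing specific to ideal()) lists your biggest
> objects:
>
> sizes <- sapply(ls(), function(nm) object.size(get(nm)))
> head(sort(sizes, decreasing = TRUE), 5)   # the five largest objects, in bytes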
>
> Now, from your description it sounds like you should be able to do your
> simulation within a memory usage of 3.4 GB. But whether or not that is
> possible depends crucially on the implementation, which we have no idea about.
> There are many naive ways to implement this in R that will take you far
> beyond the memory limit. For example, you are doing thinning - but if the
> sampler first computes the full MCMC chain and only thins it afterwards,
> then the thinning will not help your memory consumption at all (see the
> sketch below).
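>
> A minimal sketch of the difference, with a made-up random-walk step standing
> in for the real sampler (and scaled down so it runs quickly):
>
> n_iter <- 100000; thin <- 100; n_par <- 189
> draw_one <- function(s) s + 0.01 * rnorm(length(s))  # placeholder update, not ideal()'s
>
> ## memory-hungry: keep every draw, thin afterwards (n_iter x n_par doubles;
> ## with maxiter = 5e8 that would be hundreds of GB)
> # chain <- matrix(NA_real_, nrow = n_iter, ncol = n_par)
>
> ## memory-friendly: only ever hold the thinned draws
> kept  <- matrix(NA_real_, nrow = n_iter / thin, ncol = n_par)
> state <- rnorm(n_par)
> for (i in seq_len(n_iter)) {
>   state <- draw_one(state)
>   if (i %% thin == 0) kept[i / thin, ] <- state
> }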
>
> I would spend some time checking up on the implementation of the sampler you
> are using.
>
> Kasper
>
>
>
>>
>> On Mon, Jun 30, 2008 at 11:38 AM, Kasper Daniel Hansen
>> <khansen at stat.berkeley.edu> wrote:
>>> Thanks for the clarification. How did you get that output?
>>>
>>> Kasper
>>>
>>> On Jun 30, 2008, at 10:23 AM, Simon Urbanek wrote:
>>>
>>>>
>>>> On Jun 30, 2008, at 1:04 PM, Kasper Daniel Hansen wrote:
>>>>
>>>>> Like Sean is saying, you most likely are using _way_ more memory than 1.2
>>>>> GB.
>>>>>
>>>>> However, if you are running 32-bit R (which is the case if you use the
>>>>> CRAN binary) R can only access 2 GB,
>>>>
>>>> That's not true; a 32-bit process can use up to about 3.5 GB of RAM:
>>>>
>>>> Virtual Memory Map of process 2849 (R)
>>>> Output report format: 2.2 -- 32-bit process
>>>> [...]
>>>> ReadOnly portion of Libraries: Total=72.9M resident=36.6M(50%)
>>>> swapped_out_or_unallocated=36.3M(50%)
>>>> Writable regions: Total=3.4G written=3.4G(100%) resident=3.4G(99%)
>>>> swapped_out=3352K(0%) unallocated=19.3M(1%)
>>>>
>>>> so it should make no real difference for Antonio (unless he doesn't mind
>>>> waiting while the machine swaps). Nonetheless using 64-bit R is fine as
>>>> well, especially on Leopard - albeit that doesn't fix incorrect use of
>>>> memory by users :).
>>>>
>>>> Cheers,
>>>> S
>>>>
>>>>
>>>>> so you can squeeze a little more out of your machine by switching to a
>>>>> 64-bit version of R. You can check which version you have by typing
>>>>> R> .Machine
>>>>> and looking for sizeof.pointer - if it is 4 you are running 32-bit R, if
>>>>> it is 8 you are running 64-bit R.
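>>>>>
>>>>> For example, on a 64-bit build:
>>>>>
>>>>> R> .Machine$sizeof.pointer
>>>>> [1] 8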
>>>>>
>>>>> If you want the 64 bit version of R you can download a binary from
>>>>> Simon's page: r.research.att.com , but you need to also get the preview
>>>>> build of GCC 4.2 which is available from Apple's developer site
>>>>> (although
>>>>> hard to find these days).
>>>>>
>>>>> Kasper
>>>>>
>>>>> On Jun 30, 2008, at 3:23 AM, Sean Davis wrote:
>>>>>
>>>>>> On Sun, Jun 29, 2008 at 6:35 AM, Antonio P. Ramos
>>>>>> <ramos.grad.student at gmail.com> wrote:
>>>>>>>
>>>>>>> Hi everybody,
>>>>>>>
>>>>>>> I have a memory allocation problem while using R on my MacBook Pro,
>>>>>>> which runs the latest Leopard. I'm trying to run a Monte Carlo
>>>>>>> simulation with 500,000 iterations, but it fails:
>>>>>>>
>>>>>>>
>>>>>>> Starting MCMC Iterations...
>>>>>>> Error: cannot allocate vector of size 1.2 Gb
>>>>>>> R(176,0xa0640fa0) malloc: *** mmap(size=1239990272) failed (error
>>>>>>> code=12)
>>>>>>> *** error: can't allocate region
>>>>>>> *** set a breakpoint in malloc_error_break to debug
>>>>>>> R(176,0xa0640fa0) malloc: *** mmap(size=1239990272) failed (error
>>>>>>> code=12)
>>>>>>> *** error: can't allocate region
>>>>>>> *** set a breakpoint in malloc_error_break to debug
>>>>>>>
>>>>>>>
>>>>>>> Since my machine has 4 GB of memory, and since I'm not running anything
>>>>>>> in addition to the simulation, I found this strange. This is my machine:
>>>>>>>
>>>>>>> Model Identifier: MacBookPro3,1
>>>>>>> Processor Name: Intel Core 2 Duo
>>>>>>> Processor Speed: 2.4 GHz
>>>>>>> Memory: 4 GB
>>>>>>>
>>>>>>>
>>>>>>> Unfortunately, I could not figure out how to solve it. Any help?
>>>>>>
>>>>>> The error message above means that R failed to allocate a vector of
>>>>>> size 1.2 Gb. That doesn't mean that R was using only 1.2 Gb, but that
>>>>>> it was trying to allocate a new block of memory of that size in
>>>>>> addition to the memory that was already in use. The system on the Mac
>>>>>> uses a fair amount of memory, and R itself was probably already using
>>>>>> quite a bit as well. In short, you either need more memory or need to
>>>>>> be more clever about how you use the memory you have. Without more
>>>>>> details about what you are doing, it is difficult to know how to
>>>>>> change the latter.
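>>>>>>
>>>>>> As a sense of scale (simple arithmetic on the mmap size in your error,
>>>>>> nothing specific to the model):
>>>>>>
>>>>>> 1239990272 / 2^30   # the failed request, ~1.15 GiB
>>>>>> 1239990272 / 8      # i.e. roughly 155 million doubles in a single vector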
>>>>>>
>>>>>> Sean
>>>>>>
>>>>>
>>>>>
>>>>>
>>>
>>>
>
Thomas Lumley                  Assoc. Professor, Biostatistics
tlumley at u.washington.edu    University of Washington, Seattle