[R] R crashes with memory errors on a 256GB machine (and system shows only 60GB usage)
jholtman at gmail.com
Sat Jan 4 01:59:04 CET 2014
It would help to know the sizes of the objects that you have in your
workspace, and also to see the 10 lines of your script prior to the
point of the error, so that we can tell what you are trying to do. The
following commands will list the sizes of the objects:
Copy and paste them into the command line when the error occurs.
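The commands themselves do not appear to have survived in this archive copy. A common base-R idiom for this (my reconstruction, not necessarily the exact snippet from the original message) is:

```r
# List the size of every object in the global environment, largest
# first. (A reconstruction -- the original snippet is missing from
# the archive.)
sizes <- sapply(ls(envir = .GlobalEnv),
                function(x) object.size(get(x, envir = .GlobalEnv)))
print(sort(sizes, decreasing = TRUE))
```

A single object much larger than expected is the usual culprit when R's reported usage and the system monitor disagree.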
Data Munger Guru
What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.
On Fri, Jan 3, 2014 at 3:40 PM, Xebar Saram <zeltakc at gmail.com> wrote:
> Hi again and thank you all for the answers
> i need to add that I'm relatively new to R, so I apologize in advance
> i started R with the --vanilla option and ran gc()
> this is the output i get:
>          used (Mb) gc trigger (Mb) max used (Mb)
> Ncells 182236  9.8     407500 21.8   350000 18.7
> Vcells 277896  2.2     786432  6.0   785897  6.0
> also this is the memory.profile()
>        NULL      symbol    pairlist     closure environment     promise
>           1        5611       86695        2277         314        4175
>    language     special     builtin        char     logical     integer
>       21636          44         637        6361        4574       11089
>      double     complex   character         ...         any        list
>         782           1       20934           0           0        8023
>  expression    bytecode externalptr     weakref         raw          S4
>           1        6271        1272         364         365         831
> I'm running on Linux (Arch Linux) and 'free' shows this:
> zeltak at zuni ~ ↳ free -h
>                     total   used   free  shared  buffers  cached
> Mem:                 251G    99G   152G     66G     249M     84G
> -/+ buffers/cache:            14G   237G
> Swap:                  0B     0B     0B
> I'm not running anything in parallel at all.
> Milan: how does one know if the memory is fragmented?
> Thank you all again, I really appreciate the help.
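On the fragmentation question, a sketch of the relevant arithmetic (my own illustration, not from the thread): R needs one contiguous allocation per vector, so "cannot allocate vector of size X" reports the size of the single failed request, not total memory use.

```r
# The single contiguous block R needs for a numeric (double) vector
# of n elements is 8 bytes per element.
n <- 5e9           # hypothetical vector length
bytes <- n * 8     # total bytes requested in ONE allocation
bytes / 2^30       # roughly 37 GiB that must be contiguous
```

If the process address space is fragmented, such a request can fail even when `free` shows ample memory overall.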
> On Thu, Jan 2, 2014 at 10:35 PM, Ben Bolker <bbolker at gmail.com> wrote:
>> Xebar Saram <zeltakc <at> gmail.com> writes:
>> > Hi All,
>> > I have a terrible issue I can't seem to debug, which is halting my work
>> > completely. I have R 3.0.2 installed on a Linux machine (Arch Linux)
>> > which I built specifically for running high-memory models. The system
>> > is a 16-core, 256GB RAM machine. It worked well at the start, but in
>> > recent days I keep getting errors and crashes regarding memory use, such as
>> > "cannot create vector size of XXX, not enough memory" etc.
>> > When looking at top (the Linux system monitor) I see I barely scrape 60GB
>> > of RAM (out of 256GB).
>> > I really don't know how to debug this, and my whole work is halted, so
>> > any help would be greatly appreciated.
>> I'm very sympathetic, but it will be almost impossible to debug
>> this sort of problem remotely without a reproducible example.
>> The only guess that I can make, if you *really* are running *exactly*
>> the same code as you previously ran successfully, is that you might
>> have some very large objects hidden away in a saved workspace in a
>> .RData file that's being loaded automatically ...
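One way to act on this guess (a sketch of my own; the filename is simply R's default auto-load location) is to inspect any saved workspace file directly, without letting it into the current session:

```r
# Check whether a saved workspace is lurking in the working directory,
# and how large its objects are, by loading it into a scratch
# environment rather than the global one.
f <- ".RData"                       # the file R auto-loads on startup
if (file.exists(f)) {
  cat("on-disk size:", file.info(f)$size, "bytes\n")
  e <- new.env()
  load(f, envir = e)                # does not touch .GlobalEnv
  print(sapply(ls(e), function(x) object.size(get(x, envir = e))))
}
```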
>> I would check whether gc(), memory.profile(), etc. give sensible results
>> in a clean R session (R --vanilla).
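Spelled out, that sanity check needs nothing beyond base R:

```r
# Start the session with:  R --vanilla
# (skips .Rprofile and any saved .RData)
gc()                    # memory in use and the GC trigger thresholds
memory.profile()        # counts of allocated R objects by type
ls(all.names = TRUE)    # should be empty in a truly clean session
```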
>> Ben Bolker
>> R-help at r-project.org mailing list
>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.