[R] ?summaryRprof running at 100% cpu for one hour ...

Mike Marchywka marchywka at hotmail.com
Sun Nov 21 03:12:10 CET 2010

> Date: Sat, 20 Nov 2010 21:30:38 -0300
> From: kjetilbrinchmannhalvorsen at gmail.com
> To: ligges at statistik.tu-dortmund.de
> CC: r-help at r-project.org
> Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...
>
> see below.
>
> 2010/11/20 Uwe Ligges :
> >
> >
> > On 19.11.2010 21:43, Kjetil Halvorsen wrote:
> >>
> >> This is very strange. (Debian squeeze, R 2.12.0 compiled from source)
> >>
> >> I did some moderately large computation (including svd of a 560x50
> >> matrix),
> >> running a few minutes, and R memory increasing to about 900MB on this
> >> 2 GB ram laptop. I had done Rprof(memory.profiling=TRUE) first.
> >> Then doing summaryRprof().
> >> Then doing
> >> ?summaryRprof
> >> and then the computer running with one of two cores at 100% for more
> >> than an hour!
> >>
> >> What's happening?
> >
> > We do not know. What about sending a reproducible example?
>
> I will try. But how do I send this info when I have to kill
> the R-process from outside?
>

Can you run it under gdb? Just break a few times and see whether the
stack trace is informative. In a tight loop you usually only need to
sample a few times to find the offender.
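Roughly like this (12345 is just a placeholder pid; substitute whatever
ps or top reports for your R process):

  $ gdb -p 12345      attaches and stops the process
  (gdb) bt            print the current stack trace
  (gdb) continue      let it run; hit Ctrl-C after a moment
  (gdb) bt            sample again; repeat a few times
  (gdb) detach        R keeps running afterwards
  (gdb) quit

If the same frames keep showing up in the backtraces, that's your
tight loop.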
 
The question still remains whether you are using the other tools to
isolate the issue. Once memory gets tight, you usually end up in
virtual memory. I'm not entirely sure what you are doing here: you
recognize that memory is the limiting factor and want to see which
objects are using it, which is obviously a good approach. However,
once you instrument things, the instrumentation itself tends to
distort and slow things down. This is especially true of finer-grained
memory profiling, such as looking at low-level cache hits (probably
not relevant here, but something to know about).
Something like gdb, or sampling an unmodified program, may be more
informative, depending on exactly how the R memory profiling is
implemented.
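For what it's worth, a minimal sketch of the instrumented run (the svd
call is just a stand-in for your real computation):

  Rprof("prof.out", memory.profiling = TRUE)
  x <- svd(matrix(rnorm(560 * 50), 560, 50))  # stand-in workload
  Rprof(NULL)                                 # stop profiling first
  summaryRprof("prof.out", memory = "both")

Stopping the profiler with Rprof(NULL) before calling summaryRprof()
at least rules out the profiler tracing its own summary.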
 
I haven't profiled on Linux lately, or much at all, but on Windows the
Task Manager shows CPU usage dropping once you start page faulting. It
is possible that much of the speed issue is due to that, and anything
that adds to memory usage could really slow things down.
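On Linux you can check for that directly, e.g.:

  $ vmstat 1      watch the si/so columns; nonzero means swapping
  $ free -m       how much swap is already in use

If si/so stay at zero while R pins a core, paging probably isn't the
problem.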
 
> kjetil
>
> >
> > Best,
> > Uwe
> >
> >
> >> (running R from within emacs-ess)
> >> Kjetil
> >>