[R] Problem with memory consuming algorithm
jholtman at gmail.com
Sat Sep 17 03:01:34 CEST 2011
The first thing I would do, especially before letting it run for 48
hours, is to take a small subset of the data and run it with the
profiler (Rprof) enabled to see where the time is being spent. I would
also put some print statements in the main loop to periodically output
the amount of CPU time and memory being used. It is hard to look through
the code without a subset of the data, so run the profiler and tell
us what the output looks like. You may be calling a function that is
not part of your code that is taking all the time, but until you run
the profiler, it is hard to tell.
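A minimal sketch of both suggestions, assuming your call looks roughly like `simuFunctionNBM(dat)` (the function name is from the post; `dat`, `small`, and the loop body are placeholders for your own code):

```r
## 1) Profile a small run to see where the time goes.
##    memory.profiling = TRUE also records memory use per sample.
Rprof("simu.prof", memory.profiling = TRUE)   # start the sampling profiler
# result <- simuFunctionNBM(small)            # run on a small subset of the data
Rprof(NULL)                                   # stop profiling
# summaryRprof("simu.prof")                   # table of time spent per function

## 2) Periodic progress/memory printout inside the main loop.
n <- 1000                                     # placeholder for your iteration count
for (i in seq_len(n)) {
    ## ... one iteration of the algorithm ...
    if (i %% 100 == 0) {
        cat("iter", i,
            "| mem used (MB):", sum(gc()[, 2]),     # total Ncells + Vcells in MB
            "| elapsed (s):", proc.time()["elapsed"],
            "\n")
    }
}
```

`summaryRprof()` is what to post back to the list: its `$by.self` table shows which functions consume the time, whether they are yours or something called underneath.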
On Fri, Sep 16, 2011 at 4:45 PM, Simon Zehnder <szehnder at uni-bonn.de> wrote:
> Hi guys,
> I have serious problems with an algorithm I let run on a supercomputer:
> You find the functions under the following URLs:
> simuFunctionCaller: http://pastebin.com/6gw2fJFb
> calls Function simuFunctionNBM (http://pastebin.com/QeJDUnqx)
> after reading a csv-file ordered like the following:
> Reading the file and running the code without an exception is no problem, but the algorithm takes hours to run and seems to use more memory than is available:
> I have set a time limit of 48 h and a memory limit of 128 GB RAM on the supercomputer (it is a cluster with a batch system); nevertheless, the job is killed after a certain time because of memory problems.
> Does anyone see a problem in the algorithm, especially with so much RAM and time available?
> I am very thankful for your suggestions!
> R-help at r-project.org mailing list
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
Data Munger Guru
What is the problem that you are trying to solve?