[R] help track a segmentation fault
Omar Lakkis
uofiowa at gmail.com
Wed May 4 18:51:55 CEST 2005
I have an R script that I run using
nohup R CMD BATCH r.in r.out &
The code loops through data from a database and takes hours to run. The
problem is that about an hour and a half after I start the script, the
program stops and I get
/usr/lib/R/bin/BATCH: line 55: 14067 Done  ( echo "invisible(options(echo = TRUE))"; cat ${in}; echo "proc.time()" )
      14068 Segmentation fault  | ${R_HOME}/bin/R ${opts} >${out} 2>&1
in the nohup.out file.
If I run the code from within R, not using CMD BATCH, R segfaults
after roughly the same hour and a half.
I monitored the process using "top" and found nothing unusual: memory
utilization stays around 15% and CPU usage is in the 90% range. I do
not see a steady increase in memory usage that would signal a memory leak.
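I suppose I could also log R's own allocation statistics from inside the
loop, something like the sketch below (the loop bound and body are
placeholders, not my actual code):

    ## sketch: print R's allocation stats every 100 iterations so a
    ## slow R-level growth would show up in the gc() "used" columns
    for (i in 1:n) {
        ## ... fetch and process one chunk from the database ...
        if (i %% 100 == 0) print(gc())
    }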
A core dump file was generated, but I do not know what to do with it.
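From what I have read, a core file can be loaded into gdb together with
the R binary, roughly like this (the binary path below is my guess for
this Debian layout; /usr/bin/R itself is only a shell script, so the
real executable may live somewhere like /usr/lib/R/bin/exec/R):

    $ gdb /usr/lib/R/bin/exec/R core
    (gdb) bt        # print the backtrace at the point of the crash

Is that the right way to use it?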
Can someone please suggest how I can track down this problem, perhaps
using the -d flag to R, which I do not know how to use?
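For the -d flag, my understanding is that something like the following
should start R under gdb, after which I could source the script (r.in,
from the BATCH command above) and wait for the fault:

    $ R -d gdb
    (gdb) run              # start R under the debugger
    > source("r.in")       # reproduce the long-running loop
    ## ... when the segfault happens, gdb regains control ...
    (gdb) bt               # show where it crashed

Is this the right approach?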
I am running R 2.1.0 on Debian.