[R] setting sensitivity of r to errors
bt_jannis at yahoo.de
Thu Mar 25 13:17:37 CET 2010
Does anyone know how to increase R's sensitivity to errors? I just migrated back from Matlab, and I really enjoyed that Matlab pops up with (really helpful!) error messages as soon as there is anything slightly wrong with the code. This is certainly annoying on the first run, but it really helps to uncover hidden bugs in the code. Now I tried to artificially create errors in R to understand the try() function. Surprisingly, I hardly managed to create one. It would help if at least things like these would raise errors:
- division by zero
- calculating mean/sd/max etc. of arrays containing NAs
- indexing with arrays that contain NAs or Infs
I tried hard:
The last 4 lines did not produce any error, just NAs or empty arrays.
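(The snippet itself did not come through in the archive; the following are illustrative expressions matching the description, not the original code.) All of these run without raising an error:

```r
## None of the following raises an error in R -- they return Inf, NaN,
## or NA-containing vectors instead (illustrative, not the original code):
1/0                    # Inf, not an error
0/0                    # NaN, not an error
mean(c(1, 2, NA))      # NA
sd(c(1, 2, NA))        # NA
max(c(1, 2, NA))       # NA

x <- c(10, 20, 30)
x[c(1, NA)]            # 10 NA  -- an NA index silently yields NA
x[c(TRUE, NA, TRUE)]   # 10 NA 30
```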
Is there any way to change this?
My problem is that I am running large loops over a huge set of time series that differ so much in length and in the number of NAs that it is hard to figure out all possible errors beforehand. (If I could do so, I could most probably publish a paper about my series straight away :-) )
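A minimal sketch of the kind of strictness meant here (the `series_list` data, the `options(warn = 2)` setting, and the explicit finiteness check are illustrative assumptions, not something from the post itself):

```r
## Sketch: make each loop iteration fail loudly on NA/NaN/Inf input,
## while catching the error so one bad series does not kill the run.
options(warn = 2)  # promote every warning to an error

series_list <- list(c(1, 2, 3), c(1, NA, 3))  # toy stand-in data

results <- vector("numeric", length(series_list))
for (i in seq_along(series_list)) {
  results[i] <- tryCatch({
    s <- series_list[[i]]
    stopifnot(all(is.finite(s)))  # error out on NA/NaN/Inf instead of silence
    mean(s)
  }, error = function(e) {
    message("series ", i, " skipped: ", conditionMessage(e))
    NA_real_
  })
}
results  # 2 NA
```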
Thanks for your help!