[R-SIG-Finance] Problems when estimating GARCH parameters with fGarch

Curtis Miller cgmil at msn.com
Thu Nov 2 20:56:55 CET 2017


Hello all,

I have encountered bad behavior in fGarch's garchFit() function when 
estimating the parameters of a GARCH model. The estimates behave 
erratically on simulated data. For example, when the true beta = 0.2 in 
the simulation, the function sometimes estimates beta to be essentially 
zero (around 1e-7), even for sample sizes as large as 1000, and there 
are other irregularities. I believe this behavior is tied to how the 
numerical optimizers compute the parameters.
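
To make this concrete, here is a minimal script illustrating the kind of 
experiment I ran (the omega and alpha values below are illustrative 
stand-ins; the exact settings are in the blog post linked at the end):

    library(fGarch)

    set.seed(110117)

    # Simulate a GARCH(1, 1) series with beta = 0.2 (omega and alpha
    # are placeholder values for illustration)
    spec <- garchSpec(model = list(omega = 0.2, alpha = 0.2, beta = 0.2))
    x <- garchSim(spec, n = 1000)

    # Fit a GARCH(1, 1) model; beta1 should come out near 0.2, but I
    # sometimes see it driven to the lower boundary (around 1e-7)
    fit <- garchFit(~ garch(1, 1), data = x, trace = FALSE)
    coef(fit)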

In my research I planned to use garchFit() from fGarch in a changepoint 
detection context, hoping to use the estimates to detect structural 
change in GARCH parameters. (See, for example, Ling's 2007 paper: 
https://arxiv.org/abs/0708.2369 .) But with this behavior I don't know 
whether such a test based on garchFit() is feasible; the estimates are 
too unreliable.

Has anyone else observed this behavior? Is there a way to get around it? 
I'm hoping someone who knows more about this can offer guidance.
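
In case it helps frame suggestions, the two workarounds I have in mind 
are (a) switching the optimizer garchFit() uses, via its algorithm 
argument, and (b) cross-checking the fit against the rugarch package. A 
rough sketch, continuing from the simulation above (I haven't verified 
these resolve the issue):

    # (a) Try an alternative optimizer, e.g. L-BFGS-B followed by a
    # Nelder-Mead refinement, instead of the default "nlminb"
    fit_nm <- garchFit(~ garch(1, 1), data = x, trace = FALSE,
                       algorithm = "lbfgsb+nm")
    coef(fit_nm)

    # (b) Fit the same GARCH(1, 1) model with rugarch for comparison
    library(rugarch)
    spec_ru <- ugarchspec(variance.model = list(model = "sGARCH",
                                                garchOrder = c(1, 1)),
                          mean.model = list(armaOrder = c(0, 0)))
    fit_ru <- ugarchfit(spec_ru, data = as.numeric(x))
    coef(fit_ru)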

I have written a blog post documenting the behavior I observed, with 
numerical experiments. Here is a link: 
https://ntguardian.wordpress.com/2017/11/02/problems-estimating-garch-parameters-r/


Curtis


