[R-SIG-Finance] Realized GARCH estimation problem

Crib jacobchristoffer sending from gmail.com
Sat Sep 11 16:08:20 CEST 2021


I'm trying to produce one-day-ahead volatility forecasts for Bitcoin with a
Realized GARCH(1,1) model using the rugarch package in R. The realized
variance (data.xts$rv5) is aggregated at a 5-minute frequency, and the
returns (data.xts$ret) are close-to-close. Here is the specification:

rgarch.spec <- ugarchspec(mean.model = list(armaOrder = c(0, 0),
                                            include.mean = FALSE),
                          variance.model = list(model = 'realGARCH',
                                                garchOrder = c(1, 1)),
                          distribution.model = 'norm')

rgarchroll <- ugarchroll(spec = rgarch.spec,
                         data = data.xts$ret,
                         n.ahead = 1,
                         forecast.length = forecast_len,
                         refit.every = 5,
                         solver = 'hybrid',
                         realizedVol = data.xts$rv5,
                         VaR.alpha = c(0.01, 0.05, 0.10))
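
Before rolling, the same specification can also be fit once on the full
sample to inspect the estimated coefficients (a minimal sketch, assuming
the same data objects as above; the fit object name is mine):

library(rugarch)
library(xts)

# One full-sample fit with the same inputs as the roll call above, to
# check that the estimation converges and the coefficients look sane.
rgarch.fit <- ugarchfit(spec = rgarch.spec,
                        data = data.xts$ret,
                        solver = 'hybrid',
                        realizedVol = data.xts$rv5)
coef(rgarch.fit)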

realized_vol <- sqrt(tail(data.xts$rv5, forecast_len))
rgarch.prediction_vol <- rgarchroll@forecast$density$Sigma
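
A comparison plot of the two series can be produced roughly as follows
(a base-graphics sketch; the labels and colors are my own choices):

# Line up both series on the forecast dates and overlay them.
plot_dates <- tail(index(data.xts), forecast_len)
plot(plot_dates, as.numeric(realized_vol), type = 'l',
     xlab = 'Date', ylab = 'Volatility',
     main = 'Realized vs. one-day-ahead forecast volatility')
lines(plot_dates, rgarch.prediction_vol, col = 'red')
legend('topright', legend = c('Realized', 'Realized GARCH forecast'),
       col = c('black', 'red'), lty = 1)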

A plot of the results can be found here: https://i.stack.imgur.com/XAA3r.png

As you can see, the predicted volatility is consistently higher than the
realized volatility. Needless to say, the VaR predictions are not accurate
at all. However, the standard GARCH(1,1) model works fine using the same
return data. So what could possibly be the issue?
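
For what it's worth, a quick scale check can be run on the two inputs
(a minimal sketch, assuming both are plain numeric xts columns), since a
persistent level gap between squared returns and the realized measure
would bias the fit; rugarch's report method then gives a formal coverage
test of the rolling VaR forecasts:

# Ratio of the average squared close-to-close return to the average
# realized variance; a value far from 1 suggests the two inputs are on
# different scales (e.g. percent vs. decimal returns, or a realized
# measure that misses part of the day).
mean(data.xts$ret^2, na.rm = TRUE) / mean(data.xts$rv5, na.rm = TRUE)

# Built-in VaR exceedance test for the rolling forecasts.
report(rgarchroll, type = 'VaR', VaR.alpha = 0.01, conf.level = 0.95)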
