[R-sig-ME] Tweedie GLMM with "cplm" package

Pratyaydipta Rudra pratyayr at gmail.com
Fri Apr 22 07:49:48 CEST 2016


I am trying to use the "cplm" package in R to fit a compound Poisson
generalized linear mixed model. Although the package is quite powerful, it
seems to have some bugs and memory issues. I am using the function "cpglmm"
to fit a model with a log link that has a fixed intercept and a
group-specific random intercept. My code looks like:

fit=cpglmm(y ~ 1 + (1|s))
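For reference, a complete call would also pass the data (and, if desired, the link explicitly); a minimal sketch is below, where `mydata` is a placeholder name for the data frame, not from my actual session:

```r
library(cplm)

## Hypothetical data frame 'mydata' containing the response y and the
## grouping factor s; cpglmm() uses the log link by default.
fit <- cpglmm(y ~ 1 + (1 | s), link = "log", data = mydata)
summary(fit)
```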

A common error that I get (I have tried it on both a Windows machine and a
Mac) is:

> 'Calloc' could not allocate memory (18446744071562067968 of 8 bytes)

It seems to me that there is a memory leak. Unfortunately this error is not
reproducible, and the model fits fine on an immediate second attempt.

A second error that I get is reproducible. Please find the R object
"bug1.RObj" at the link below:

When I run the following, I get different estimates for the same model (the
model corresponding to dat[2, ]). If I fit it again and again, it gives the
same result as the second fit; but if I fit it immediately after the first
model, it gives a different result.

tryfit=cpglmm(as.numeric(dat[1,]) ~ 1 + (1|s))
tryfit=cpglmm(as.numeric(dat[2,]) ~ 1 + (1|s))
tryfit=cpglmm(as.numeric(dat[2,]) ~ 1 + (1|s))

There were many other instances with similar code where it seems like the
package is "carrying over" something from the previous fit. For example, a
dataset fits perfectly fine on its own, but when fitted after another
particular fit, my computer gets stuck and never recovers until I close the
R session.

The only way I could resolve this issue in all such cases was to detach the
package after each fit and re-attach it, which I would rather not do.
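Concretely, the workaround I mean is the cycle below, wrapped around each fit (`mydata` is again a placeholder data frame):

```r
library(cplm)

## Fit the model, then unload and re-attach cplm so that no state
## carries over into the next fit.
fit <- cpglmm(y ~ 1 + (1 | s), data = mydata)  # 'mydata' is a placeholder
detach("package:cplm", unload = TRUE)
library(cplm)
```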

A different error comes from the function "cpglm" when I try to fit a
fixed-effects model with just an intercept:

> Error: cannot allocate vector of size 5.6 Gb
   In addition: Warning messages:
   1: In dtweedie.logw.smallp(y = y, power = power, phi = phi) :
   Reached total allocation of 16275Mb: see help(memory.size)

I don't understand why R should attempt to allocate such a huge amount of
memory to fit a dataset with ~180 observations.

I tried to read through the package functions to track down these errors,
but could not figure it out.

Any help will be appreciated.


*Pratyaydipta Rudra*
Post Doctoral fellow
Department of Biostatistics and Informatics
Colorado School of Public Health
University of Colorado Anschutz Medical Campus

