[R] sem problem - did not converge

Jeremy Miles jeremy.miles at gmail.com
Mon Feb 14 20:15:57 CET 2011


You have a fairly large and complex model there.  This sort of model
(almost) always causes problems.

I would try fitting one factor at a time.  That might help you to
narrow down the problem.  If one factor doesn't converge, the whole
model won't converge.
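
For example, a one-factor model for F5 alone might look something like
the rough sketch below, using the same specify.model()/sem() idiom as
in your code and reusing your dados40.cov and item names (model.F5 and
sem.F5 are just placeholder names; the same pattern applies to each
factor in turn):

library(sem)

# one-factor CFA for F5 only; specify.model() reads the spec until the blank line
model.F5 <- specify.model()
F5 ->  Item5,  lam5,  NA
F5 ->  Item10, lam10, NA
F5 ->  Item20, lam20, NA
F5 ->  Item25, lam25, NA
F5 ->  Item30, lam30, NA
F5 ->  Item35, lam35, NA
F5 ->  Item45, lam45, NA
Item5  <-> Item5,  e5,  NA
Item10 <-> Item10, e10, NA
Item20 <-> Item20, e20, NA
Item25 <-> Item25, e25, NA
Item30 <-> Item30, e30, NA
Item35 <-> Item35, e35, NA
Item45 <-> Item45, e45, NA
F5 <-> F5, NA, 1

# fit just this factor against the same covariance matrix
sem.F5 <- sem(model.F5, dados40.cov, nrow(dados40))
summary(sem.F5)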

You might also consider joining SEMNET, the structural equation
modeling mailing list. This isn't really a problem with sem (the
package) or with R; it's a more general SEM (the approach) problem.

Jeremy

Chen, F., Bollen, K., Paxton, P., Curran, P.J., & Kirby, J. (2001).
Improper solutions in structural equation models: Causes,
consequences, and strategies. Sociological Methods and Research, 29,
468-508.

On 14 February 2011 07:34, Felipe Bhering <felipelbhering at gmail.com> wrote:
>
> Can someone help me? I have tried several things and the model never converges.
>
> # Model
> library(sem)
> dados40.cov <- cov(dados40,method="spearman")
> model.dados40 <- specify.model()
> F1 ->  Item11, lam11, NA
> F1 ->  Item31, lam31, NA
> F1 ->  Item36, lam36, NA
> F1 ->  Item54, lam54, NA
> F1 ->  Item63, lam63, NA
> F1 ->  Item65, lam65, NA
> F1 ->  Item67, lam67, NA
> F1 ->  Item69, lam69, NA
> F1 ->  Item73, lam73, NA
> F1 ->  Item75, lam75, NA
> F1 ->  Item76, lam76, NA
> F1 ->  Item78, lam78, NA
> F1 ->  Item79, lam79, NA
> F1 ->  Item80, lam80, NA
> F1 ->  Item83, lam83, NA
> F2 ->  Item12, lam12, NA
> F2 ->  Item32, lam32, NA
> F2 ->  Item42, lam42, NA
> F2 ->  Item47, lam47, NA
> F2 ->  Item64, lam64, NA
> F2 ->  Item66, lam66, NA
> F2 ->  Item68, lam68, NA
> F2 ->  Item74, lam74, NA
> F3 ->  Item3, lam3, NA
> F3 ->  Item8, lam8, NA
> F3 ->  Item18, lam18, NA
> F3 ->  Item23, lam23, NA
> F3 ->  Item28, lam28, NA
> F3 ->  Item33, lam33, NA
> F3 ->  Item38, lam38, NA
> F3 ->  Item43, lam43, NA
> F4 ->  Item9, lam9, NA
> F4 ->  Item39, lam39, NA
> F5 ->  Item5, lam5, NA
> F5 ->  Item10, lam10, NA
> F5 ->  Item20, lam20, NA
> F5 ->  Item25, lam25, NA
> F5 ->  Item30, lam30, NA
> F5 ->  Item35, lam35, NA
> F5 ->  Item45, lam45, NA
> Item3 <-> Item3, e3,   NA
> Item5 <-> Item5, e5,   NA
> Item8 <-> Item8, e8,   NA
> Item9 <-> Item9, e9,   NA
> Item10 <-> Item10, e10,   NA
> Item11 <-> Item11, e11,   NA
> Item12 <-> Item12, e12,   NA
> Item18 <-> Item18, e18,   NA
> Item20 <-> Item20, e20,   NA
> Item23 <-> Item23, e23,   NA
> Item25 <-> Item25, e25,   NA
> Item28 <-> Item28, e28,   NA
> Item30 <-> Item30, e30,   NA
> Item31 <-> Item31, e31,   NA
> Item32 <-> Item32, e32,   NA
> Item33 <-> Item33, e33,   NA
> Item35 <-> Item35, e35,   NA
> Item36 <-> Item36, e36,   NA
> Item38 <-> Item38, e38,   NA
> Item39 <-> Item39, e39,   NA
> Item42 <-> Item42, e42,   NA
> Item43 <-> Item43, e43,   NA
> Item45 <-> Item45, e45,   NA
> Item47 <-> Item47, e47,   NA
> Item54 <-> Item54, e54,   NA
> Item63 <-> Item63, e63,   NA
> Item64 <-> Item64, e64,   NA
> Item65 <-> Item65, e65,   NA
> Item66 <-> Item66, e66,   NA
> Item67 <-> Item67, e67,   NA
> Item68 <-> Item68, e68,   NA
> Item69 <-> Item69, e69,   NA
> Item73 <-> Item73, e73,   NA
> Item74 <-> Item74, e74,   NA
> Item75 <-> Item75, e75,   NA
> Item76 <-> Item76, e76,   NA
> Item78 <-> Item78, e78,   NA
> Item79 <-> Item79, e79,   NA
> Item80 <-> Item80, e80,   NA
> Item83 <-> Item83, e83,   NA
> F1 <-> F1, NA,    1
> F2 <-> F2, NA,    1
> F3 <-> F3, NA,    1
> F4 <-> F4, NA,    1
> F5 <-> F5, NA,    1
> F1 <-> F2, F1F2, NA
> F1 <-> F3, F1F3, NA
> F1 <-> F4, F1F4, NA
> F1 <-> F5, F1F5, NA
> F2 <-> F3, F2F3, NA
> F2 <-> F4, F2F4, NA
> F2 <-> F5, F2F5, NA
> F3 <-> F4, F3F4, NA
> F3 <-> F5, F3F5, NA
> F4 <-> F5, F4F5, NA
>
>
> ### I tried several correlation types, such as hetcor and polychor from the polycor library
>
>
> library(polycor)  # hetcor() comes from the polycor package
> hcor <- function(data) hetcor(data, std.err=FALSE)$correlations
> hetdados40 <- hcor(dados40)
>
>
> dados40.sem <- sem(model.dados40, dados40.cov, nrow(dados40))
> Warning message:
> In sem.default(ram = ram, S = S, N = N, param.names = pars, var.names =
> vars,  :
>  Could not compute QR decomposition of Hessian.
> Optimization probably did not converge.
>
> #####################################################
>
> The same thing happens if I put hetdados40 in place of dados40.cov;
> of course hetdados40 has 1s on the diagonal, but no 0s anywhere.
>
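> For example, the substitution just described is simply (dados40.sem2
> being a placeholder name here):
>
> dados40.sem2 <- sem(model.dados40, hetdados40, nrow(dados40))
>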
> What should I do? I have tried several things...
>
> All of the eigenvalues are positive:
>
> #####################################################
>
>> eigen(hetdados40)$values
>  [1] 14.7231030  4.3807378  1.6271780  1.4000193  1.0670784  1.0217670
>  [7]  0.8792466  0.8103790  0.7397817  0.7279262  0.6909955  0.6589746
> [13]  0.6237204  0.6055884  0.5777750  0.5712017  0.5469284  0.5215437
> [19]  0.5073809  0.4892339  0.4644124  0.4485545  0.4372404  0.4290573
> [25]  0.4270672  0.4071262  0.3947753  0.3763811  0.3680527  0.3560231
> [31]  0.3537934  0.3402836  0.3108977  0.3099143  0.2819351  0.2645035
> [37]  0.2548654  0.2077900  0.2043732  0.1923942
>> eigen(dados40.cov)$values
>  [1] 884020.98 337855.95 138823.30 126291.58  87915.21  79207.04  73442.71
>  [8]  68388.11  60625.26  58356.54  55934.05  54024.00  50505.10  48680.26
> [15]  46836.47  45151.23  43213.65  41465.42  40449.59  37824.73  37622.43
> [22]  36344.34  35794.22  33959.29  33552.64  32189.94  31304.44  30594.85
> [29]  30077.32  29362.66  26928.12  26526.72  26046.47  24264.50  23213.18
> [36]  21503.97  20312.55  18710.97  17093.24  14372.21
>
> #####################################################
>
>
> There are 40 variables and 1004 subjects, so the number of variables
> should not be a problem either!



-- 
Jeremy Miles
Psychology Research Methods Wiki: www.researchmethodsinpsychology.com


