[R] Troubleshooting underidentification issues in structural equation modelling (SEM)

Bert Gunter gunter.berton at gene.com
Fri Feb 15 11:11:01 CET 2013


These are statistical issues, not R issues, so please do not post further
here. You are clearly out of your depth statistically. You need to get
local statistical help, or you can try posting on a statistics site such
as stats.stackexchange.com if you care to take advice from unknown
sources who don't know the details of your situation.

-- Bert

On Fri, Feb 15, 2013 at 1:11 AM, Ruijie <breakaway8 at gmail.com> wrote:
> Thanks, Prof. Fox, for your guidance. My purpose in fitting this model is to
> contrast it with another model that I am proposing, which I believe will be
> a better fit.
>
> On the point of some of the items being close to invariant, I had a close
> look at my data and that is indeed the case; I am aware of it. However, I am
> not sure what to do with these items. Do I remove them? If so, what
> threshold of variance do I set for removal, and how do I decide on that
> threshold?
>
> I've combed a number of textbooks for answers but sadly have not found
> much. I hope you can offer some advice. Thanks!
>
> Regards,
> Ruijie (RJ)
>
> --------
> He who has a why can endure any how.
>
> ~ Friedrich Nietzsche
>
>
> On 10 February 2013 00:38, John Fox <jfox at mcmaster.ca> wrote:
>
>> Dear Ruijie,
>>
>> Your model is underidentified by virtue of two of the factors having only
>> one observed indicator each. No SEM software can magically estimate this
>> model as it stands. Beyond that, I won't comment on the wisdom of what
>> you're doing, such as computing covariances between ordinal variables --
>> but see what I discovered below.
>>
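>> (If you really needed to keep a factor with a single indicator, the usual
>> device is to fix its loading to 1 and its measurement-error variance to 0
>> -- or to (1 - reliability)*var(item), if you have a reliability estimate --
>> so that the factor simply stands in for the observed item. As a rough,
>> untested sketch, the relevant lines of a specifyModel() specification,
>> with F06 and I44 purely as placeholder names, would be
>>
>>     F06 ->  I44, NA, 1
>>     I44 <-> I44, NA, 0
>>     F06 <-> F06, phi66, NA
>>
>> read in with, e.g., model2 <- specifyModel(file = "single-indicator.txt"),
>> leaving the factor variance free. Here, though, removal is simpler.)
>>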
>> Removing these two variables and the associated factors produces the
>> following model:
>>
>> --------- snip ------------
>>
>> > model <- cfa(reference.indicators=FALSE)
>> 1: F01: I01, I02, I03
>> 2: F02: I04, I05, I06, I07, I08, I09, I10, I11, I12, I13
>> 3: F03: I14, I15, I16, I17, I18, I19, I20, I21, I22, I23, I24, I25, I26
>> 4: F04: I27, I28, I29, I30, I31, I32, I33, I34
>> 5: F05: I35, I36, I37, I38, I39, I40, I41, I42, I43
>> 6: F07: I46, I47, I48, I49, I50, I51
>> 7: F08: I54, I55, I56, I57, I58, I59, I60, I61, I62, I63, I64
>> 8: F09: I65, I66, I67
>> 9: F11: I69, I70, I71
>> 10:
>> Read 9 items
>> NOTE: adding 66 variances to the model
>> >
>> > cfa.output <- sem(model, cov.mat, N = 900)
>>
>> --------- snip ------------
>>
>> sem() ran out of iterations, but the summary output is revealing:
>>
>> --------- snip ------------
>>
>> > summary(cfa.output)
>>
>>  Model Chisquare =  5677.1   Df =  2043 Pr(>Chisq) = 0
>>  AIC =  6013.1
>>  BIC =  -8220.193
>>
>>  Normalized Residuals
>>    Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
>> -3.9910 -0.5887 -0.1486  0.2588  0.8092 17.2900
>>
>>  R-square for Endogenous Variables
>>     I01     I02     I03     I04     I05     I06     I07     I08     I09     I10
>>  0.0953  0.1263  0.0000  0.1131  0.4039  0.2519  0.1168  0.0468  0.0005  0.0059
>>     I11     I12     I13     I14     I15     I16     I17     I18     I19     I20
>>  0.0479  0.0228  0.1150  0.2813  0.0001  0.0388  0.2106  0.0001  0.0913  0.0063
>>     I21     I22     I23     I24     I25     I26     I27     I28     I29     I30
>>  0.0041  0.0077  0.0022  0.0000  0.0299  0.0067  0.0019  0.0011  0.0010  0.0000
>>     I31     I32     I33     I34     I35     I36     I37     I38     I39     I40
>>  0.0005  0.0117  0.0270  0.0001  0.0084  0.0001  0.0256  0.4969  0.0613  0.0515
>>     I41     I42     I43     I46     I47     I48     I49     I50     I51     I54
>>  0.0005  0.0052  0.0307  0.0003  0.1131  0.0014  0.0000  0.1276  0.9728  0.0520
>>     I55     I56     I57     I58     I59     I60     I61     I62     I63     I64
>>  0.2930  0.0127  0.0543  0.0500  0.0378  0.0001  0.3048  0.0002  0.0304  0.0001
>>     I65     I66     I67     I69     I70     I71
>> 56.7264  0.0000  0.0002  0.2220  0.2342  0.2240
>>
>>  Parameter Estimates
>>              Estimate      Std Error    z value      Pr(>|z|)
>>
>> lam[I01:F01]  3.023074e-02 5.133785e-03  5.888586224  3.895133e-09 I01 <---
>> F01
>> lam[I02:F01]  3.283192e-02 5.291069e-03  6.205157975  5.464199e-10 I02 <---
>> F01
>> lam[I03:F01]  1.123398e-04 2.695713e-03  0.041673509  9.667590e-01 I03 <---
>> F01
>> lam[I04:F02]  1.365329e-01 1.555023e-02  8.780124358  1.632940e-18 I04 <---
>> F02
>> lam[I05:F02]  9.525580e-02 5.517838e-03 17.263245517  8.896692e-67 I05 <---
>> F02
>> lam[I06:F02]  1.720147e-01 1.277593e-02 13.463962882  2.548717e-41 I06 <---
>> F02
>> lam[I07:F02]  3.164280e-02 3.543421e-03  8.930015663  4.259485e-19 I07 <---
>> F02
>> lam[I08:F02]  5.685988e-02 1.021854e-02  5.564386503  2.630763e-08 I08 <---
>> F02
>> lam[I09:F02]  1.234516e-03 2.228298e-03  0.554017268  5.795670e-01 I09 <---
>> F02
>> lam[I10:F02]  1.656005e-02 8.458411e-03  1.957820181  5.025112e-02 I10 <---
>> F02
>> lam[I11:F02]  8.785114e-02 1.560646e-02  5.629151062  1.810987e-08 I11 <---
>> F02
>> lam[I12:F02]  3.022114e-02 7.815459e-03  3.866842129  1.102537e-04 I12 <---
>> F02
>> lam[I13:F02]  5.075487e-02 5.732307e-03  8.854177302  8.430329e-19 I13 <---
>> F02
>> lam[I14:F03]  2.587670e-01 2.308125e-02 11.211137448  3.595430e-29 I14 <---
>> F03
>> lam[I15:F03] -2.999816e-04 1.469667e-03 -0.204115351  8.382634e-01 I15 <---
>> F03
>> lam[I16:F03]  2.314973e-02 5.256310e-03  4.404179628  1.061849e-05 I16 <---
>> F03
>> lam[I17:F03]  9.333201e-02 9.301123e-03 10.034488472  1.075152e-23 I17 <---
>> F03
>> lam[I18:F03] -3.389770e-04 1.469665e-03 -0.230649144  8.175874e-01 I18 <---
>> F03
>> lam[I19:F03]  6.783532e-02 1.005099e-02  6.749117110  1.487475e-11 I19 <---
>> F03
>> lam[I20:F03]  3.916003e-02 2.208166e-02  1.773418523  7.615938e-02 I20 <---
>> F03
>> lam[I21:F03]  7.260062e-03 5.059696e-03  1.434881038  1.513210e-01 I21 <---
>> F03
>> lam[I22:F03]  4.556262e-02 2.322628e-02  1.961683814  4.979931e-02 I22 <---
>> F03
>> lam[I23:F03]  1.528270e-03 1.469492e-03  1.039998378  2.983407e-01 I23 <---
>> F03
>> lam[I24:F03] -8.635421e-04 7.794243e-03 -0.110792296  9.117811e-01 I24 <---
>> F03
>> lam[I25:F03]  3.625777e-02 9.391320e-03  3.860774500  1.130282e-04 I25 <---
>> F03
>> lam[I26:F03]  2.350350e-02 1.287924e-02  1.824913234  6.801412e-02 I26 <---
>> F03
>> lam[I27:F04]  8.013741e-03 7.100286e-03  1.128650332  2.590454e-01 I27 <---
>> F04
>> lam[I28:F04]  1.094008e-03 1.051268e-03  1.040655898  2.980353e-01 I28 <---
>> F04
>> lam[I29:F04]  3.712052e-03 3.647614e-03  1.017665748  3.088368e-01 I29 <---
>> F04
>> lam[I30:F04]  2.309796e-04 3.735193e-03  0.061838730  9.506913e-01 I30 <---
>> F04
>> lam[I31:F04]  9.905663e-03 1.152962e-02  0.859149344  3.902581e-01 I31 <---
>> F04
>> lam[I32:F04]  2.612580e-02 2.019934e-02  1.293398622  1.958732e-01 I32 <---
>> F04
>> lam[I33:F04]  8.299228e-02 6.192966e-02  1.340105491  1.802111e-01 I33 <---
>> F04
>> lam[I34:F04] -1.131056e-03 2.529220e-03 -0.447195412  6.547340e-01 I34 <---
>> F04
>> lam[I35:F05]  7.917586e-03 3.671643e-03  2.156414987  3.105128e-02 I35 <---
>> F05
>> lam[I36:F05] -1.122579e-03 6.021404e-03 -0.186431415  8.521065e-01 I36 <---
>> F05
>> lam[I37:F05]  5.245211e-03 1.392977e-03  3.765467592  1.662377e-04 I37 <---
>> F05
>> lam[I38:F05]  1.459603e-01 1.212396e-02 12.038999880  2.216262e-33 I38 <---
>> F05
>> lam[I39:F05]  9.091376e-02 1.563821e-02  5.813567281  6.115538e-09 I39 <---
>> F05
>> lam[I40:F05]  1.174920e-01 2.202669e-02  5.334074682  9.603300e-08 I40 <---
>> F05
>> lam[I41:F05] -6.674451e-03 1.240103e-02 -0.538217344  5.904270e-01 I41 <---
>> F05
>> lam[I42:F05]  2.074782e-02 1.220154e-02  1.700426338  8.905076e-02 I42 <---
>> F05
>> lam[I43:F05]  2.058762e-02 4.991076e-03  4.124885623  3.709190e-05 I43 <---
>> F05
>> lam[I46:F07] -7.270739e-03 1.477067e-02 -0.492241486  6.225486e-01 I46 <---
>> F07
>> lam[I47:F07]  3.294388e-02 3.596677e-03  9.159533769  5.212202e-20 I47 <---
>> F07
>> lam[I48:F07]  1.960841e-02 1.764661e-02  1.111171519  2.664945e-01 I48 <---
>> F07
>> lam[I49:F07] -3.231036e-06 1.918097e-03 -0.001684501  9.986560e-01 I49 <---
>> F07
>> lam[I50:F07]  3.300839e-02 3.426575e-03  9.633058172  5.797778e-22 I50 <---
>> F07
>> lam[I51:F07]  3.234144e-02 1.806978e-03 17.898079438  1.220591e-71 I51 <---
>> F07
>> lam[I54:F08]  1.003417e-01 1.711888e-02  5.861462155  4.588091e-09 I54 <---
>> F08
>> lam[I55:F08]  1.408049e-01 9.886797e-03 14.241707324  5.047855e-46 I55 <---
>> F08
>> lam[I56:F08]  4.096655e-02 1.425085e-02  2.874673321  4.044457e-03 I56 <---
>> F08
>> lam[I57:F08]  7.137153e-02 1.191379e-02  5.990663872  2.089862e-09 I57 <---
>> F08
>> lam[I58:F08]  1.206947e-01 2.100849e-02  5.745043255  9.189749e-09 I58 <---
>> F08
>> lam[I59:F08]  7.178104e-02 1.439758e-02  4.985632949  6.175929e-07 I59 <---
>> F08
>> lam[I60:F08]  2.027172e-03 6.627611e-03  0.305867676  7.597054e-01 I60 <---
>> F08
>> lam[I61:F08]  1.215272e-01 8.374503e-03 14.511567971  1.023539e-47 I61 <---
>> F08
>> lam[I62:F08]  1.072324e-03 3.404172e-03  0.315002895  7.527595e-01 I62 <---
>> F08
>> lam[I63:F08]  4.836428e-02 1.084696e-02  4.458785647  8.242530e-06 I63 <---
>> F08
>> lam[I64:F08] -7.221766e-04 2.879830e-03 -0.250770557  8.019915e-01 I64 <---
>> F08
>> lam[I65:F09]  3.983293e+00 9.711381e+01  0.041016748  9.672825e-01 I65 <---
>> F09
>> lam[I66:F09] -1.673556e-03 4.096286e-02 -0.040855450  9.674111e-01 I66 <---
>> F09
>> lam[I67:F09]  5.049621e-04 1.235197e-02  0.040881113  9.673907e-01 I67 <---
>> F09
>> lam[I69:F11]  1.586150e-01 1.373361e-02 11.549406592  7.433188e-31 I69 <---
>> F11
>> lam[I70:F11]  8.237619e-02 6.956861e-03 11.840999012  2.395820e-32 I70 <---
>> F11
>> lam[I71:F11]  9.448552e-02 8.147082e-03 11.597468367  4.244491e-31 I71 <---
>> F11
>> C[F01,F02]    3.728217e-02 9.597514e-02  0.388456537  6.976782e-01 F02 <-->
>> F01
>> C[F01,F03]    7.240582e-01 1.355959e-01  5.339824854  9.303642e-08 F03 <-->
>> F01
>> C[F01,F04]   -5.354253e-01 5.303413e-01 -1.009586227  3.126936e-01 F04 <-->
>> F01
>> C[F01,F05]    2.384885e-01 1.052432e-01  2.266070269  2.344708e-02 F05 <-->
>> F01
>> C[F01,F07]    1.040182e+00 1.489435e-01  6.983736644  2.874306e-12 F07 <-->
>> F01
>> C[F01,F08]   -1.013298e-01 1.035977e-01 -0.978107752  3.280210e-01 F08 <-->
>> F01
>> C[F01,F09]    1.171918e-02 2.860487e-01  0.040969189  9.673205e-01 F09 <-->
>> F01
>> C[F01,F11]    7.946394e-02 1.093765e-01  0.726517178  4.675218e-01 F11 <-->
>> F01
>> C[F02,F03]    2.272594e-01 6.201036e-02  3.664862498  2.474715e-04 F03 <-->
>> F02
>> C[F02,F04]    1.730434e-01 2.421846e-01  0.714510214  4.749117e-01 F04 <-->
>> F02
>> C[F02,F05]    5.724325e-02 5.826660e-02  0.982436740  3.258847e-01 F05 <-->
>> F02
>> C[F02,F07]    6.462176e-02 4.345441e-02  1.487116261  1.369841e-01 F07 <-->
>> F02
>> C[F02,F08]    9.751552e-01 4.152782e-02 23.481976829 6.233472e-122 F08 <-->
>> F02
>> C[F02,F09]   -6.044195e-04 1.578879e-02 -0.038281562  9.694632e-01 F09 <-->
>> F02
>> C[F02,F11]    1.026869e-01 6.243113e-02  1.644803751  1.000103e-01 F11 <-->
>> F02
>> C[F03,F04]    7.503546e-01 5.859127e-01  1.280659345  2.003133e-01 F04 <-->
>> F03
>> C[F03,F05]    2.162240e-01 6.673622e-02  3.239980149  1.195380e-03 F05 <-->
>> F03
>> C[F03,F07]    3.686512e-01 5.011777e-02  7.355697641  1.899325e-13 F07 <-->
>> F03
>> C[F03,F08]    2.308590e-01 6.677771e-02  3.457127167  5.459671e-04 F08 <-->
>> F03
>> C[F03,F09]    3.422314e-02 8.348605e-01  0.040992640  9.673018e-01 F09 <-->
>> F03
>> C[F03,F11]    2.699455e-01 7.051428e-02  3.828238253  1.290638e-04 F11 <-->
>> F03
>> C[F04,F05]    1.062305e+00 7.911158e-01  1.342793467  1.793389e-01 F05 <-->
>> F04
>> C[F04,F07]   -8.324317e-02 1.748320e-01 -0.476132285  6.339801e-01 F07 <-->
>> F04
>> C[F04,F08]    1.389356e-01 2.448826e-01  0.567356043  5.704723e-01 F08 <-->
>> F04
>> C[F04,F09]    5.856590e-02 1.429422e+00  0.040971726  9.673184e-01 F09 <-->
>> F04
>> C[F04,F11]    2.294948e+00 1.661805e+00  1.380997204  1.672798e-01 F11 <-->
>> F04
>> C[F05,F07]    2.099261e-01 4.716298e-02  4.451078015  8.544029e-06 F07 <-->
>> F05
>> C[F05,F08]    4.221026e-02 6.261302e-02  0.674145115  5.002191e-01 F08 <-->
>> F05
>> C[F05,F09]    3.165187e-02 7.721368e-01  0.040992561  9.673018e-01 F09 <-->
>> F05
>> C[F05,F11]    7.351754e-01 6.818771e-02 10.781639916  4.203245e-27 F11 <-->
>> F05
>> C[F07,F08]    3.180037e-03 4.670052e-02  0.068094253  9.457106e-01 F08 <-->
>> F07
>> C[F07,F09]    6.292195e-03 1.535561e-01  0.040976532  9.673146e-01 F09 <-->
>> F07
>> C[F07,F11]    1.049909e-01 4.942732e-02  2.124147077  3.365785e-02 F11 <-->
>> F07
>> C[F08,F09]    1.346105e-02 3.284233e-01  0.040986879  9.673064e-01 F09 <-->
>> F08
>> C[F08,F11]    1.383223e-01 6.694679e-02  2.066152656  3.881407e-02 F11 <-->
>> F08
>> C[F09,F11]    4.571695e-02 1.115233e+00  0.040993193  9.673013e-01 F11 <-->
>> F09
>> V[I01]        8.680184e-03 4.762484e-04 18.226169942  3.199593e-74 I01 <-->
>> I01
>> V[I02]        7.459398e-03 4.540213e-04 16.429621740  1.173889e-60 I02 <-->
>> I02
>> V[I03]        7.478254e-03 3.527242e-04 21.201419570 9.265904e-100 I03 <-->
>> I03
>> V[I04]        1.461376e-01 7.255861e-03 20.140635357  3.251385e-90 I04 <-->
>> I04
>> V[I05]        1.339123e-02 8.832859e-04 15.160696593  6.438285e-52 I05 <-->
>> I05
>> V[I06]        8.789764e-02 4.794460e-03 18.333167786  4.499223e-75 I06 <-->
>> I06
>> V[I07]        7.568474e-03 3.765280e-04 20.100692934  7.277043e-90 I07 <-->
>> I07
>> V[I08]        6.587699e-02 3.167671e-03 20.796666217  4.639577e-96 I08 <-->
>> I08
>> V[I09]        3.217338e-03 1.517789e-04 21.197527600  1.006468e-99 I09 <-->
>> I09
>> V[I10]        4.621928e-02 2.185030e-03 21.152695320  2.606174e-99 I10 <-->
>> I10
>> V[I11]        1.535621e-01 7.387455e-03 20.786870576  5.690287e-96 I11 <-->
>> I11
>> V[I12]        3.908344e-02 1.860301e-03 21.009196121  5.404186e-98 I12 <-->
>> I12
>> V[I13]        1.983328e-02 9.856998e-04 20.121018746  4.830497e-90 I13 <-->
>> I13
>> V[I14]        1.710572e-01 1.211810e-02 14.115839622  3.033809e-45 I14 <-->
>> I14
>> V[I15]        1.075179e-03 5.071602e-05 21.199985035 9.552682e-100 I15 <-->
>> I15
>> V[I16]        1.326202e-02 6.467196e-04 20.506601881  1.879773e-93 I16 <-->
>> I16
>> V[I17]        3.265749e-02 1.988078e-03 16.426667150  1.232493e-60 I17 <-->
>> I17
>> V[I18]        1.075154e-03 5.071579e-05 21.199589039 9.633394e-100 I18 <-->
>> I18
>> V[I19]        4.579942e-02 2.353962e-03 19.456315348  2.576564e-84 I19 <-->
>> I19
>> V[I20]        2.413742e-01 1.144346e-02 21.092761358  9.269013e-99 I20 <-->
>> I20
>> V[I21]        1.269773e-02 6.009212e-04 21.130448044  4.175664e-99 I21 <-->
>> I21
>> V[I22]        2.667065e-01 1.265916e-02 21.068268778  1.555139e-98 I22 <-->
>> I22
>> V[I23]        1.072933e-03 5.069564e-05 21.164210344  2.041534e-99 I23 <-->
>> I23
>> V[I24]        3.024220e-02 1.426452e-03 21.200993757 9.350120e-100 I24 <-->
>> I24
>> V[I25]        4.271005e-02 2.065984e-03 20.672986805  6.064466e-95 I25 <-->
>> I25
>> V[I26]        8.208471e-02 3.892796e-03 21.086314551  1.062215e-98 I26 <-->
>> I26
>> V[I27]        3.448443e-02 1.627464e-03 21.189053796  1.204944e-99 I27 <-->
>> I27
>> V[I28]        1.074072e-03 5.065613e-05 21.203199739 8.921947e-100 I28 <-->
>> I28
>> V[I29]        1.388601e-02 6.548663e-04 21.204342235 8.707941e-100 I29 <-->
>> I29
>> V[I30]        3.656256e-02 1.724532e-03 21.201435371 9.262794e-100 I30 <-->
>> I30
>> V[I31]        1.989840e-01 9.383562e-03 21.205594692 8.479218e-100 I31 <-->
>> I31
>> V[I32]        5.755557e-02 2.882318e-03 19.968499245  1.035172e-88 I32 <-->
>> I32
>> V[I33]        2.481455e-01 1.532786e-02 16.189179144  6.012530e-59 I33 <-->
>> I33
>> V[I34]        1.484183e-02 7.000026e-04 21.202534570 9.048952e-100 I34 <-->
>> I34
>> V[I35]        7.415580e-03 3.516263e-04 21.089380308  9.955712e-99 I35 <-->
>> I35
>> V[I36]        2.011634e-02 9.488573e-04 21.200591226 9.430434e-100 I36 <-->
>> I36
>> V[I37]        1.047757e-03 5.025784e-05 20.847625170  1.601775e-96 I37 <-->
>> I37
>> V[I38]        2.156861e-02 3.241426e-03  6.654050864  2.851341e-11 I38 <-->
>> I38
>> V[I39]        1.265785e-01 6.238795e-03 20.288931432  1.610577e-91 I39 <-->
>> I39
>> V[I40]        2.541968e-01 1.242997e-02 20.450322391  5.967951e-93 I40 <-->
>> I40
>> V[I41]        8.528364e-02 4.023849e-03 21.194542822  1.072350e-99 I41 <-->
>> I41
>> V[I42]        8.216499e-02 3.888144e-03 21.132187265  4.024656e-99 I42 <-->
>> I42
>> V[I43]        1.337408e-02 6.438437e-04 20.772251070  7.715629e-96 I43 <-->
>> I43
>> V[I46]        1.907454e-01 8.996895e-03 21.201249767 9.299396e-100 I46 <-->
>> I46
>> V[I47]        8.508783e-03 4.165525e-04 20.426677159  9.687421e-93 I47 <-->
>> I47
>> V[I48]        2.714640e-01 1.280461e-02 21.200497563 9.449220e-100 I48 <-->
>> I48
>> V[I49]        3.218862e-03 1.518230e-04 21.201415045 9.266795e-100 I49 <-->
>> I49
>> V[I50]        7.447779e-03 3.685477e-04 20.208454710  8.249036e-91 I50 <-->
>> I50
>> V[I51]        2.929982e-05 1.053218e-04  0.278193234  7.808640e-01 I51 <-->
>> I51
>> V[I54]        1.833931e-01 8.842196e-03 20.740673158  1.488283e-95 I54 <-->
>> I54
>> V[I55]        4.784306e-02 2.783744e-03 17.186584134  3.346789e-66 I55 <-->
>> I55
>> V[I56]        1.304849e-01 6.185550e-03 21.095115843  8.818929e-99 I56 <-->
>> I56
>> V[I57]        8.868251e-02 4.280267e-03 20.718917274  2.338858e-95 I57 <-->
>> I57
>> V[I58]        2.765876e-01 1.332324e-02 20.759777754  1.000282e-95 I58 <-->
>> I58
>> V[I59]        1.309969e-01 6.275841e-03 20.873197799  9.384143e-97 I59 <-->
>> I59
>> V[I60]        2.844711e-02 1.341830e-03 21.200226581 9.503782e-100 I60 <-->
>> I60
>> V[I61]        3.368300e-02 1.992102e-03 16.908270471  3.910162e-64 I61 <-->
>> I61
>> V[I62]        7.504898e-03 3.540020e-04 21.200154519 9.518345e-100 I62 <-->
>> I62
>> V[I63]        7.472838e-02 3.568523e-03 20.940981942  2.267379e-97 I63 <-->
>> I63
>> V[I64]        5.371193e-03 2.533508e-04 21.200616220 9.425427e-100 I64 <-->
>> I64
>> V[I65]       -1.558692e+01 7.736661e+02 -0.020146825  9.839262e-01 I65 <-->
>> I65
>> V[I66]        6.009302e-02 2.837570e-03 21.177638375  1.535393e-99 I66 <-->
>> I66
>> V[I67]        1.075013e-03 5.220505e-05 20.592119939  3.229259e-94 I67 <-->
>> I67
>> V[I69]        8.817859e-02 5.000004e-03 17.635704215  1.310532e-69 I69 <-->
>> I69
>> V[I70]        2.218392e-02 1.279170e-03 17.342438243  2.249872e-67 I70 <-->
>> I70
>> V[I71]        3.093500e-02 1.758727e-03 17.589432179  2.968370e-69 I71 <-->
>> I71
>>
>>  Iterations =  1000
>>
>> --------- snip ------------
>>
>> Several of the observed variables have R^2s that round to 0 and many more
>> are very small.
>>
>> I don't have your original data, but I did look at the input covariance
>> matrix. Here are the standard deviations of the observed variables:
>>
>> --------- snip ------------
>>
>> > sqrt(diag(cov.mat))
>>        I01        I02        I03        I04        I05        I06        I07
>> 0.09794939 0.09239769 0.08647698 0.40592964 0.14988296 0.34276336 0.09257290
>>        I08        I09        I10        I11        I12        I13        I14
>> 0.26288788 0.05673501 0.21562354 0.40159670 0.19999190 0.14969750 0.48787040
>>        I15        I16        I17        I18        I19        I20        I21
>> 0.03279129 0.11746460 0.20339207 0.03279129 0.22450179 0.49285671 0.11291786
>>        I22        I23        I24        I25        I26        I27        I28
>> 0.51844236 0.03279129 0.17390500 0.20982058 0.28746674 0.18587268 0.03279129
>>        I29        I30        I31        I32        I33        I34        I35
>> 0.11789736 0.19121352 0.44618622 0.24132578 0.50500808 0.12183229 0.08647698
>>        I36        I37        I38        I39        I40        I41        I42
>> 0.14183651 0.03279129 0.20705800 0.36721084 0.51768833 0.29210990 0.28739426
>>        I43        I45        I46        I47        I48        I49        I50
>> 0.11746460 0.13454976 0.43680464 0.09794939 0.52139099 0.05673501 0.09239769
>>        I51        I54        I55        I56        I57        I58        I59
>> 0.03279129 0.43984267 0.26013269 0.36354251 0.30622933 0.53958761 0.36898429
>>        I60        I61        I62        I63        I64        I65        I66
>> 0.16867489 0.22011795 0.08663745 0.27761032 0.07329198 0.52861343 0.24514452
>>        I67        I68        I69        I70        I71
>> 0.03279129 0.16616880 0.33665601 0.17020504 0.19965594
>>
>> --------- snip ------------
>>
>> Some of the standard deviations are very small, suggesting that the
>> corresponding variables must have been close to invariant in your data set.
>>
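>> A quick way to flag such items from the covariance matrix (the 0.05
>> cutoff below is arbitrary, purely for illustration):
>>
>>     sds <- sqrt(diag(cov.mat))
>>     sort(sds)[1:10]          # the ten least variable items
>>     names(sds)[sds < 0.05]   # items whose SD falls below the cutoff
>>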
>> If you haven't already done so, I think that you might back up and look
>> more
>> closely at your data, and perhaps seek some competent local help.
>>
>> I hope that this helps,
>>  John
>>
>> -----------------------------------------------
>> John Fox
>> Senator McMaster Professor of Social Statistics
>> Department of Sociology
>> McMaster University
>> Hamilton, Ontario, Canada
>>
>>
>>
>> > -----Original Message-----
>> > From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org]
>> > On Behalf Of Ruijie
>> > Sent: Friday, February 08, 2013 9:56 PM
>> > To: R-help at stat.math.ethz.ch
>> > Subject: [R] Troubleshooting underidentification issues in structural
>> > equation modelling (SEM)
>> >
>> > Hi all, I hope someone can help me out with this.
>> >
>> > Background
>> >
>> > I have a data set collected from a questionnaire that I wish to
>> > validate. I have chosen to use confirmatory factor analysis to analyse
>> > this data set.
>> >
>> > Instrument
>> >
>> > The instrument consists of 11 subscales, with a total of 68 items.
>> > Each item is scored on an integer scale from 1 to 4.
>> >
>> > Confirmatory factor analysis (CFA) setup
>> >
>> > I use the sem package to conduct the CFA. My code is as follows:
>> >
>> > cov.mat <- as.matrix(read.table("http://dl.dropbox.com/u/1445171/cov.mat.csv",
>> >                                 sep = ",", header = TRUE))
>> > rownames(cov.mat) <- colnames(cov.mat)
>> >
>> > model <- cfa(file = "http://dl.dropbox.com/u/1445171/cfa.model.txt",
>> >              reference.indicators = FALSE)
>> > cfa.output <- sem(model, cov.mat, N = 900, maxiter = 80000,
>> >                   optimizer = optimizerOptim)
>> >
>> > Warning message:
>> > In eval(expr, envir, enclos) : Negative parameter variances.
>> >   Model may be underidentified.
>> >
>> > Straight off you might notice a few anomalies; let me explain.
>> >
>> >    - Why is the optimizer chosen to be optimizerOptim?
>> >
>> > ANS: I originally stuck with the default optimizerSem, but no matter how
>> > many iterations I allowed, either I ran out of memory first (8GB RAM
>> > setup) or it reported no convergence. Things "seemed" a little better
>> > when I switched to optimizerOptim, whereby it would conclude successfully
>> > but throw up the warning that the model is underidentified. Upon closer
>> > inspection, I realise that the output shows convergence as TRUE but
>> > iterations as NA, so I am not sure what exactly is happening.
>> >
>> >    - Why is maxiter so high?
>> >
>> > ANS: If I set it to a lower value, the fit refuses to converge, although,
>> > as mentioned above, I doubt real convergence actually occurred.
>> >
>> > Problem
>> >
>> > By now I guessed that the model really is underidentified, so I looked
>> > for resources on resolving this problem and found:
>> >
>> >    - http://davidakenny.net/cm/identify_formal.htm
>> >    - http://faculty.ucr.edu/~hanneman/soc203b/lectures/identify.html
>> >
>> > I followed the 2nd link quite closely and applied the t-rule:
>> >
>> >    - I have 68 observed variables, providing me with 68 variances and
>> >    2278 covariances between variables = *2346 data points*.
>> >    - I have 68 loadings (regression coefficients), 68 error variances,
>> >    11 factor variances and 55 factor covariances, a total of 202
>> >    parameters.
>> >    - Since I will be fixing the variances of the 11 latent factors to 1
>> >    for scaling, those 11 drop out of the count, leaving a total of *191
>> >    free parameters to estimate*.
>> >       - My degrees of freedom are therefore 2346 - 191 = 2155, making it
>> >       an overidentified model by the t-rule (see the quick check right
>> >       after this list).
>> >
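>> > As a quick check of that counting in R (the variable names below are
>> > just for illustration):
>> >
>> > p <- 68                          # observed items
>> > f <- 11                          # latent factors
>> > moments <- p * (p + 1) / 2       # 2346 observed variances and covariances
>> > free <- 2 * p + f * (f - 1) / 2  # free loadings + error variances + factor covariances
>> > moments - free                   # 2155 degrees of freedom, > 0
>> >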
>> > Questions
>> >
>> >    1. Is the low variance of some of my items a possible cause of the
>> >    underidentification? I was previously advised to remove items with
>> >    zero variance, which led me to think about items whose variance is
>> >    very close to zero. Should they be removed too?
>> >    2. After reading widely, I suspect (but am not sure) that this might
>> >    be a case of empirical underidentification. Is there a systematic way
>> >    of diagnosing what kind of underidentification it is? And what are my
>> >    options for proceeding with my analysis?
>> >
>> > I have more questions, but let's leave it at these two for now. Thanks
>> > for any help!
>> >
>> > Regards,
>> > Ruijie (RJ)
>> >
>> > --------
>> > He who has a why can endure any how.
>> >
>> > ~ Friedrich Nietzsche
>> >
>> >
>> > ______________________________________________
>> > R-help at r-project.org mailing list
>> > https://stat.ethz.ch/mailman/listinfo/r-help
>> > PLEASE do read the posting guide http://www.R-project.org/posting-
>> > guide.html
>> > and provide commented, minimal, self-contained, reproducible code.
>>
>>
>
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



-- 

Bert Gunter
Genentech Nonclinical Biostatistics

Internal Contact Info:
Phone: 467-7374
Website:
http://pharmadevelopment.roche.com/index/pdb/pdb-functional-groups/pdb-biostatistics/pdb-ncb-home.htm


