[R-sig-ME] lmer() for conjoint analysis? (interpreting coefficients)

Marianne Promberger marianne.promberger at kcl.ac.uk
Thu Aug 5 19:00:30 CEST 2010


Dear list,

I have data from a discrete choice experiment (conjoint analysis). I'm
using lmer(... family=binomial) to analyse the data. I'm posting here
in case there is something I overlooked that makes this analysis
inappropriate. I also have two specific questions.

Our experiment aims to assess subjects' relative preferences between
the standard medication and three different alternatives for smoking
cessation, weighed against increases in treatment effectiveness.

My main specific questions are:

- Is my interpretation (below) of the coefficients of the fixed effects
  correct? (I did read Douglas Bates' answer to this post yesterday,
  https://stat.ethz.ch/pipermail/r-sig-mixed-models/2010q3/004232.html
  but I am not sure whether I am making that mistake of paying too
  much attention to a particular coefficient)

- How worried should I be about the high correlation of the fixed
  effects in my second model?

Here's the setup of the study:

Each of 98 subjects made 9 choices, each time choosing one option from
a pair of two.

We had prior evidence that subjects would prefer the standard
treatment to any of the alternatives at equal effectiveness. Hence,
to reduce the number of pairs presented to each subject, one option in
each pair was always the standard medication at the lowest level of
effectiveness (10 out of 100), and the other option was one of the
three alternatives at equal or better effectiveness: 10, 20, or 40
out of 100.

str(long)
'data.frame':	882 obs. of  5 variables:
 $ subject      : Factor w/ 98 levels "subject 001",..: 1 2 3 4 5 6 7 8 9 10 ...
 $ alternative  : Factor w/ 3 levels "alt1","alt2",..: 1 1 1 1 1 1 1 1 1 1 ...
 $ effectiveness: Factor w/ 3 levels "10","20","40": 1 1 1 1 1 1 1 1 1 1 ...
 $ choice       : Factor w/ 2 levels "0","1": 1 2 2 1 1 1 1 2 1 2 ...


I fit this model: (model 1)
lmer(choice ~ 0 + alternative + effectiveness + (1|subject), family = binomial, data = long)

                Estimate Std. Error z value Pr(>|z|)    
alternativealt1   -0.363      0.446   -0.81     0.42    
alternativealt2   -0.679      0.448   -1.51     0.13    
alternativealt3    2.422      0.459    5.28  1.3e-07 ***
effectiveness20    2.543      0.321    7.93  2.3e-15 ***
effectiveness40    3.846      0.376   10.22  < 2e-16 ***

The output makes sense and matches the story from a simple barplot
showing the percentage choosing standard vs. alternative at each level
of effectiveness. (Attached; the bars are of unequal height because I
have set intransitive preferences within an alternative to NA.)
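(For reference, newer versions of lme4 fit the binomial model with
glmer() rather than lmer(..., family = binomial); a minimal sketch,
assuming the same data frame "long", with the fit stored in m1 so it
can be reused below:)

library(lme4)
## random-intercept logistic regression, same fixed effects as above
m1 <- glmer(choice ~ 0 + alternative + effectiveness + (1 | subject),
            family = binomial, data = long)
summary(m1)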

Of interest in conjoint analysis are the relative preferences, or
"part-worth utilities", and to my understanding I can get these by
comparing coefficients: e.g. increasing effectiveness from 10 to 40
(3.846) is about 1.5 times as important as increasing it from 10 to
20 (2.543), revealed in choice behaviour in that more subjects choose
the alternative. Alternatives 1 and 2 are not significant because
standard and alternative get chosen about equally often, but they can
still be compared: alternative 3 is preferred to the standard
medication, and that preference is, e.g., about 2.42/0.68 = 3.6 times
stronger than the slight preference for the standard medication over
alternative 2.
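(To make that arithmetic explicit, the ratios can be read straight off
the fixed effects; a sketch, assuming the fit is stored in m1 as
above:)

fe <- fixef(m1)
## effect of raising effectiveness 10 -> 40 relative to 10 -> 20
fe["effectiveness40"] / fe["effectiveness20"]        # roughly 1.5
## preference for alternative 3 relative to the slight preference
## against alternative 2
fe["alternativealt3"] / abs(fe["alternativealt2"])   # roughly 3.6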

(Ideally, I will do bootstrapping to get empirical confidence
intervals around these coefficients and hence the ratios)
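(One way to do that, assuming m1 is a glmer fit and a version of lme4
that provides bootMer(); a parametric-bootstrap sketch for one of the
ratios:)

## bootstrap a confidence interval for the effectiveness ratio
ratio_fun <- function(fit) {
  fe <- fixef(fit)
  unname(fe["effectiveness40"] / fe["effectiveness20"])
}
b <- bootMer(m1, ratio_fun, nsim = 500)
quantile(b$t, c(0.025, 0.975))   # percentile confidence interval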

We also asked each subject once about their perceptions of
responsibility for smoking, and hypothesised that high perceived
responsibility would lead to rejection of the alternative treatments.

 $ respcause    : int  4 5 6 6 5 5 6 5 4 5 ...

I include the term like this (or do I need (1 + respcause | subject)
instead?):

model 2:
lmer(choice ~ 0 + alternative + effectiveness + respcause + (1|subject), family = binomial, data = long)

                Estimate Std. Error z value Pr(>|z|)    
alternativealt1    2.881      1.511    1.91  0.05655 .  
alternativealt2    2.572      1.514    1.70  0.08926 .  
alternativealt3    5.671      1.529    3.71  0.00021 ***
effectiveness20    2.550      0.322    7.92  2.3e-15 ***
effectiveness40    3.856      0.377   10.22  < 2e-16 ***
respcause         -0.753      0.335   -2.25  0.02470 *  

The effect of "respcause" is in the predicted direction (subjects who
assign more responsibility are less likely to choose the alternative).
Can I compare the coefficient of this effect, which is measured at the
subject level, to the other coefficients?
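(Since all the fixed effects are on the same log-odds scale, one thing
that may help is exponentiating them: the respcause coefficient then
becomes an odds ratio per one-point increase in the responsibility
rating, while the others are odds ratios for a change of factor level.
A sketch, assuming model 2 has been refit with glmer() and stored in
m2:)

exp(fixef(m2))                                     # odds ratios
exp(confint(m2, parm = "beta_", method = "Wald"))  # Wald intervals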

The ratios of some coefficients are roughly the same as in model 1,
e.g. between effectiveness20 and effectiveness40, but others change
quite dramatically, e.g. the ratio of effectiveness40 to
alternative 3.

Does this mean it is inappropriate to interpret the coefficients in
this way? Or is the appropriate interpretation that knowing how a
subject attributes responsibility (respcause) explains some of their
choice behaviour, and that once this knowledge is taken into account
the relative importance of the other factors changes? The latter
interpretation would make sense in our experiment: our hypothesis is
that the assignment of responsibility makes subjects prefer the
standard medication to these specific alternatives, whereas the
influence of effectiveness operates on a different level. That is,
the variance now picked up by respcause was previously picked up by
the alternative terms, while changing effectiveness adds 'independent'
variance.
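(One way to put a number on "explains some of their choice behaviour"
is a likelihood-ratio comparison of the two models; a sketch, assuming
both fits are stored as m1 and m2 and were fit to exactly the same
rows:)

## does adding the subject-level respcause term improve the fit?
anova(m1, m2)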

Is this making sense?

Maybe related is the correlation of the fixed effects in model 2,
which is very high among the alternatives and between the alternatives
and respcause. I guess that in a different type of study I would have
to worry that the model is overparametrized, but here this is arguably
part of the message. Or do I have to worry about it after all?

Correlation of Fixed Effects:
            altrn1 altrn2 altrn3 effc20 effc40
alterntvlt2  0.982                            
alterntvlt3  0.971  0.970                     
effctvnss20 -0.051 -0.058  0.014              
effctvnss40 -0.039 -0.047  0.040  0.559       
respcause   -0.956 -0.956 -0.955 -0.048 -0.059
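(On those correlations: with no intercept in the model, the three
alternative dummies each play the role of an intercept, and because
respcause is a subject-level covariate that is never zero, its
coefficient is bound to be highly correlated with them. Centring
respcause usually brings these correlations down without changing the
substance of the model; a sketch, again assuming glmer() and the data
frame "long":)

## centre the subject-level covariate so that zero is meaningful;
## the alternative coefficients then describe a subject with an
## average responsibility rating
long$respcause.c <- long$respcause - mean(long$respcause)
m2c <- glmer(choice ~ 0 + alternative + effectiveness + respcause.c
             + (1 | subject), family = binomial, data = long)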


I hope I'm not totally off. 

Thanks

Marianne

-- 
Marianne Promberger PhD, King's College London
http://promberger.info
R version 2.11.1 (2010-05-31)
Ubuntu 9.10
[Attachment: choose.png (image/png, 6636 bytes):
<https://stat.ethz.ch/pipermail/r-sig-mixed-models/attachments/20100805/ca95987b/attachment.png>]

