[R] Why terms are dropping out of an lm() model

John Pitney john at pitney.org
Fri Aug 27 00:17:24 CEST 2004


>
>
> John Pitney wrote:
>> Hi all!
>>
>> I'm fairly new to R and not too experienced with regression.  Because
>> of one or both of those traits, I'm not seeing why some terms are being
>> dropped from my model when doing a regression using lm().
>>
>> I am trying to do a regression on some experimental data d, which has
>> two numeric predictors, p1 and p2, and one numeric response, r.  The aim
>> is to compare polynomial models in p1 and p2 up to third order.  I don't
>> understand why lm() doesn't return coefficients for the p1^3 and p2^3
>> terms.  Similar loss of terms happened when I tried orthonormal
>> polynomials to third order.
>>
>> I'm satisfied with the second-order regression, by the way, but I'd
>> still like to understand why the third-order regression doesn't work
>> like I'd expect.
>>
>> Can anyone offer a pointer to help me understand this?
>>
>> Here's what I'm seeing in R 1.9.1 for Windows.  Note the NA's for p1^3
>> and p2^3 in the last summary.
>>
>> [stuff deleted]
>>
>> Residuals:
>>       Min        1Q    Median        3Q       Max
>> -0.089823 -0.017707  0.001952  0.020820  0.059302
>>
>> Coefficients: (2 not defined because of singularities)
>
> Did you miss reading the line just above? It seems you supplied a
> singular model matrix to `lm', and since the default for `lm' is
> `singular.ok = TRUE', it simply pivoted those columns out in the QR
> decomposition.

Yes, I missed that line.  The model matrix is indeed singular.
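
For the archives, here is a minimal sketch (with made-up data, not my
actual experiment) of one common way this kind of singularity arises: if a
predictor takes only three distinct levels, its cube is an exact linear
combination of the lower-order columns, so lm() has nothing left to
estimate for the cubic term.

## Hypothetical data: each predictor takes only the levels -1, 0, 1,
## so p^3 equals p exactly and the cubic columns are redundant.
set.seed(1)
d <- expand.grid(p1 = c(-1, 0, 1), p2 = c(-1, 0, 1))
d$r <- 1 + 2 * d$p1 - 0.5 * d$p2^2 + rnorm(nrow(d), sd = 0.05)

fit3 <- lm(r ~ p1 + I(p1^2) + I(p1^3) + p2 + I(p2^2) + I(p2^3), data = d)
summary(fit3)   # I(p1^3) and I(p2^3) come back NA: "2 not defined because of singularities"
alias(fit3)     # lists the exact linear dependencies among the terms

## The rank deficiency also shows up directly in the model matrix:
X <- model.matrix(fit3)
c(rank = qr(X)$rank, columns = ncol(X))   # rank 5, but 7 columns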

Thanks for the quick and helpful response, and sorry for posting before
thinking carefully enough!
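
One more note for anyone who, like me, skims over the summary output: if I
understand the explanation correctly, calling lm() with singular.ok = FALSE
makes the rank deficiency impossible to miss, because the fit stops with an
error instead of silently dropping the aliased columns (continuing the
hypothetical example above):

## With singular.ok = FALSE, lm() raises an error about a singular fit
## instead of returning NA coefficients for the aliased terms.
lm(r ~ p1 + I(p1^2) + I(p1^3) + p2 + I(p2^2) + I(p2^3),
   data = d, singular.ok = FALSE)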

John



