# [R] Regression inclusion of variable, effect on coefficients

Thiemo Fetzer tf at devmag.net
Mon Apr 21 21:06:03 CEST 2008

Hello :)

I am happy to hear that I am not necessarily asking stupid questions.

The thing is that I have data on x1 and x4 for the whole sample. However, it
is theoretically clear that the informational content of x1 is not as high as
that of x4. x4 provides more accurate information to the subjects
participating in the game, as it has been shown both experimentally and
theoretically that x1 is biased.

So the experimenters introduced x4 in response to the biased x1. Both
coexist, however, so the subjects have information on both x1 and x4
available.

Theoretically, I argued that the "relative importance" of x1 for y will
decrease once the information in x4 is available, since x4 is more accurate.

With a simple regression, however, I do not find significant relationships.
It has been shown empirically and theoretically that x1 has a positive effect
on y, and the same should hold for x4.

There is no firm theoretical argument as to how x1 and x4 interact
mathematically, as they are both measures of the same thing. Yet x4 is more
accurate and contains more information. It could be any kind of interaction.
They are positively correlated, which is also reasonable.

Could you suggest a simple interaction model with which I could try my luck?
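A minimal sketch of such an interaction model in R, on simulated stand-in data (the data-generating coefficients below are pure placeholders for illustration, not the original experimental data):

```r
set.seed(1)
n  <- 200
x2 <- rnorm(n)
x3 <- rnorm(n)
x1 <- rnorm(n)                       # the noisier, biased signal
x4 <- 0.7 * x1 + rnorm(n, sd = 0.5)  # the more accurate signal, correlated with x1
y  <- 1 + 0.5 * x1 + 0.8 * x4 + 0.3 * x2 - 0.2 * x3 + rnorm(n)

# x1 * x4 expands to x1 + x4 + x1:x4, i.e. both main effects plus the product term
m3 <- lm(y ~ x1 * x4 + x2 + x3)
summary(m3)
```

A significantly negative x1:x4 coefficient would be one way to read "x1 matters less when x4 is informative", though with observational data this stays a descriptive rather than a causal statement.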

Thanks a lot

Thiemo

-----Original Message-----
From: Uwe Ligges [mailto:ligges at statistik.tu-dortmund.de]
Sent: Monday, 21 April 2008 18:54
To: Thiemo Fetzer
Cc: r-help at r-project.org
Subject: Re: [R] Regression inclusion of variable, effect on coefficients

This is not a dumb question. This is a serious problem, and it depends on
what you know or assume about the relationship between x1 and x4. If you
assume a linear interaction, you might want to introduce an interaction
term into the model, for example.
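One way to make the comparison of the two fits concrete is an F test on the nested models, watching how the x1 coefficient changes once x4 enters. A sketch on simulated placeholder data (all coefficients invented for illustration):

```r
set.seed(42)
n  <- 200
x1 <- rnorm(n)
x2 <- rnorm(n)
x3 <- rnorm(n)
x4 <- 0.7 * x1 + rnorm(n, sd = 0.5)  # correlated measure of the same quantity
y  <- 1 + 0.4 * x1 + 0.6 * x4 + rnorm(n)

m1 <- lm(y ~ x1 + x2 + x3)       # without the more accurate measure
m2 <- lm(y ~ x1 + x2 + x3 + x4)  # with it

# F test for nested models: does adding x4 significantly improve the fit?
print(anova(m1, m2))

# In m1 the x1 coefficient also picks up the effect of the omitted x4;
# once x4 enters, the x1 estimate shrinks toward its own contribution.
coef(m1)["x1"]
coef(m2)["x1"]
```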

Uwe Ligges

Thiemo Fetzer wrote:
> Hello dear R users!
>
> I know this question is not strictly R-help, yet maybe some of the gurus
> in statistics can help me out.
>
> I have a sample of data, all from the same "population". Say my regression
> equation is this:
>
> m1 <- lm(y ~ x1 + x2 + x3)
>
> I also regress on
>
> m2 <- lm(y ~ x1 + x2 + x3 + x4)
>
> The thing is that I want to study the effect of the "information" x4.
>
> I would hypothesize that the coefficient estimate for x1 goes down as I
> introduce x4, as x4 conveys some of the information conveyed by x1 (but
> not only). Of course x1 and x4 are correlated; however, multicollinearity
> does not appear to be a problem, as the variance inflation factors are
> rather low (around 1.5 or so).
>
> I basically want to study what the interplay between x1 and x4 is when
> introducing x4 into the regression equation, and whether my hypothesis is
> correct, i.e. that once I consider the information x4, not as much of the
> variation is explained via x1 anymore.
>
> I observe that, on introducing x4 into the regression, the coefficient
> estimate for x1 goes down; the associated p-value also becomes bigger,
> i.e. x1 becomes comparatively less significant. However, x4 is not
> significant. Yet, the observation is in line with my theoretical argument.
>
> The question is now simple: how can I work this out?
>
> I know this is likely a dumb question, but I would really appreciate some
> help.
>
> Regards
>
> Thiemo
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help