[R] weighted regression
rex_bryan@urscorp.com
rexbryan1 at comcast.net
Fri Dec 19 20:07:34 CET 2003
Tom,
You are right. I goofed on my x and y's ... sorry about that.
This example came from a MathCAD discussion group in which
a math wizard with the handle Paul_W proposed a solution. I then tried
Statistica to see if a "big and professional" statistics package could
do regression with weights. Yes and no. Statistica's idea
of weighting appears to be one of "replicating" the data by
a count number called "weight". Hence if you start with n = 3 and you
weight each of them by 2, all the subsequent reports on the number of samples
will be 6. I don't know if this is standard in the statistics industry, but boy,
it didn't meet the inverse-variance idea at all. Yep, the MathCAD solution
and R seem to match perfectly. Now I'm trying to figure out how confidence
and prediction curves work with weighted regression.
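For what it's worth, here is a minimal sketch of what I mean, reusing the numbers from my example below (with x and y in their natural roles this time). predict() gives both bands; note that for a weighted fit the prediction interval also needs the assumed 1/variance weights at the new points:

```r
# Inverse-variance weighted fit (same numbers as the example below)
x <- c(1, 6, 11)
y <- c(6.7, 6.7, 6.6)
w <- c(100, 100, 6.25)          # w = 1 / sd^2

fit <- lm(y ~ x, weights = w)

newdat <- data.frame(x = c(1, 6, 11))
# Confidence band for the mean response
conf <- predict(fit, newdat, interval = "confidence")
# Prediction band; weights here are the assumed 1/variance
# at the new points (my assumption for this sketch)
pred <- predict(fit, newdat, interval = "prediction",
                weights = c(100, 100, 6.25))
```

The prediction band is wider than the confidence band, since it adds the variance of a new observation on top of the uncertainty in the fitted mean.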
Thanks for the response.
Merry Christmas
REX.
----- Original Message -----
From: "Thomas W Blackwell" <tblackw at umich.edu>
To: "rex_bryan at urscorp.com" <rexbryan1 at comcast.net>
Cc: <r-help at stat.math.ethz.ch>
Sent: Friday, December 19, 2003 8:54 AM
Subject: Re: [R] weighted regression
> Rex -
>
> Yes, you have supplied an appropriate 'weight' argument
> given the problem description in the paragraph which
> begins 'Assume that ...'.
>
> Your example would be much easier to read if the variable
> names 'x' and 'y' in the R code matched their usage in the
> paragraph description, rather than transposing. But the
> usage within the R code is consistent, although counter-
> intuitive. Your example tries to predict the values of
> an almost constant vector c(6.7,6.7,6.6) from a highly
> varying one, c(1,6,11). No surprise that the intercept
> with the vertical axis is a bit larger than 6.7 and the
> slope is completely non-significant.
>
> - tom blackwell - u michigan medical school - ann arbor
>
> On Thu, 18 Dec 2003, rex_bryan at urscorp.com wrote:
>
> > To all
> >
> > I have some simple questions pertaining to weights used in regression.
> > If the variability of the dependent variable (y) is a function of the
> > magnitude of the predictor variable (x), can the use of weights give an
> > appropriate answer for the regression parameters and the std errors?
> >
> > Assume that y at x=1 and 6 has a standard deviation of 0.1 and at x=11
> > it is 0.4.
> > Then according to a web page on weighted regression for a calibration
> > curve at http://member.nifty.ne.jp/mniwa/rev006.htm, I should use
> > 1/(std^2) for each weight.
> >
> > i.e. for x=1 and 6, w = 100, and for x=11, w = 6.25
> >
> > In R the run is:
> >
> > > y <- c(1,6,11)
> > > x <- c(6.7,6.7,6.6)
> > > w <- c(100,100,6.25)
> > > reg <- lm(x ~ y, weights = w)
> > > summary(reg)
> >
> > Call:
> > lm(formula = x ~ y, weights = w)
> >
> > Residuals:
> > 1 2 3
> > -0.04762 0.09524 -0.19048
> >
> > Coefficients:
> > Estimate Std. Error t value Pr(>|t|)
> > (Intercept) 6.707619 0.025431 263.762 0.00241 **
> > y -0.002857 0.005471 -0.522 0.69361
> > ---
> > Signif. codes: 0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1
> >
> > Residual standard error: 0.2182 on 1 degrees of freedom
> > Multiple R-Squared: 0.2143, Adjusted R-squared: -0.5714
> > F-statistic: 0.2727 on 1 and 1 DF, p-value: 0.6936
> >
> > Am I using the weight method correctly?
> > And if so, does the estimated Std. Error for the intercept and slope
> > make sense?
> >
> > On another note: how does one do a regression with the origin fixed at 0?
> >
> > Merry Christmas
> >
> > REX
> >
> >
> >
> >
> >
> > [[alternative HTML version deleted]]
> >
> > ______________________________________________
> > R-help at stat.math.ethz.ch mailing list
> > https://www.stat.math.ethz.ch/mailman/listinfo/r-help
> >
>
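On the origin question above: dropping the intercept term from the formula, with `- 1` (or equivalently `0 +`), forces the fitted line through zero. A short sketch with made-up calibration numbers:

```r
# Regression through the origin: "- 1" removes the intercept term
x <- c(1, 6, 11)
y <- c(1.1, 6.3, 10.8)          # made-up calibration readings

fit0 <- lm(y ~ x - 1)           # equivalently: lm(y ~ 0 + x)
coef(fit0)                      # a single slope, no intercept
```

The usual caveat applies: R-squared is computed differently for no-intercept models, so it is not comparable with the value from a fit that includes an intercept.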