[R] Confidence Intervals on Standard Curve

Ben Ward benjamin.ward at bathspa.org
Sat Feb 19 15:39:02 CET 2011


I've just realised that the couple of graphs I put on here have been stripped 
out. If anyone needs to see them and can't work out my problem from the code 
alone, I can send them directly to anyone who thinks they can help but wants to see them.

Thanks,
Ben W.

On 18/02/2011 23:29, Ben Ward wrote:
> Hi, I wonder if anyone could advise me with this:
>
> I've been trying to make a standard curve in R with lm() from some 
> standards read on a spectrophotometer, so that I can express the curve as 
> a formula and obtain values for my treated samples by plugging their 
> readings into it, instead of judging things by eye from a curve drawn by 
> hand.
>
> The relationship is curved, so I used the following formula:
>
> model <- lm(Approximate.Counts~X..Light.Transmission + 
> I(Approximate.Counts^2), data=Standards)
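>
> By "express the curve as a formula" I mean something along these lines, 
> just as a sketch of the idea:
>
> b <- coef(model)
> ## as I have written the model, the fitted relationship would be
> ## Approximate.Counts = b[1] + b[2]*X..Light.Transmission + b[3]*Approximate.Counts^2
> b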
>
> It gives me a pretty decent graph:
> library(lattice)
> xyplot(Approximate.Counts + fitted(model) ~ X..Light.Transmission, 
> data=Standards)
>
> I'm pretty happy with it, and to my inexperienced eyes the model summary 
> looks pretty good:
>
> lm(formula = Approximate.Counts ~ X..Light.Transmission + 
> I(Approximate.Counts^2),
>     data = Standards)
>
> Residuals:
>    Min     1Q Median     3Q    Max
> -91.75 -51.04  27.33  37.28  49.72
>
> Coefficients:
>                           Estimate Std. Error t value Pr(>|t|)
> (Intercept)              9.868e+02  2.614e+01   37.75 <2e-16 ***
> X..Light.Transmission   -1.539e+01  8.116e-01  -18.96 <2e-16 ***
> I(Approximate.Counts^2)  2.580e-04  6.182e-06   41.73 <2e-16 ***
> ---
> Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
>
> Residual standard error: 48.06 on 37 degrees of freedom
> Multiple R-squared: 0.9956,    Adjusted R-squared: 0.9954
> F-statistic:  4190 on 2 and 37 DF,  p-value: < 2.2e-16
>
> I tried to put some 95% confidence interval lines on a plot, as 
> advised by my tutor, to see how they looked, and I used a function I 
> found in "The R Book":
>
> ## From "The R Book": draws two dashed lines through the mean of the data
> ## with slope plus/minus one standard error of the slope (it is written
> ## for a regression with a single predictor).
> se.lines <- function(model){
> b1 <- coef(model)[2] + summary(model)[[4]][4]   # slope + 1 SE
> b2 <- coef(model)[2] - summary(model)[[4]][4]   # slope - 1 SE
> xm <- mean(model[[12]][2])                      # mean of the predictor
> ym <- mean(model[[12]][1])                      # mean of the response
> a1 <- ym - b1*xm                                # intercepts so both lines
> a2 <- ym - b2*xm                                # pass through (xm, ym)
> abline(a1, b1, lty=2)
> abline(a2, b2, lty=2)
> }
> se.lines(model)
>
> but when I draw these on a plot I get an odd result:
>
> [the plot showing this was stripped when posting; I can send it directly, 
> as mentioned above]
>
> The lines look to me to lie in the same kind of area that my regression 
> line did before I used polynomial regression by squaring 
> "Approximate.Counts":
>
> lm(formula = Approximate.Counts ~ X..Light.Transmission + 
> I(Approximate.Counts^2), data = Standards)
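>
> I did wonder whether something along these lines, using predict() with 
> interval = "confidence", is closer to what I should be doing (the grid of 
> x values below is just made up to span my transmission readings, and it 
> assumes the squared term sits on the predictor rather than on the counts):
>
> new.x <- data.frame(X..Light.Transmission =
>                       seq(min(Standards$X..Light.Transmission),
>                           max(Standards$X..Light.Transmission),
>                           length.out = 100))
> ci <- predict(model, newdata = new.x, interval = "confidence") # fit, lwr, upr
> plot(Approximate.Counts ~ X..Light.Transmission, data = Standards)
> matlines(new.x$X..Light.Transmission, ci, lty = c(1, 2, 2), col = "black")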
>
> Is there something else I should be doing? I've seen several ways of 
> dealing with non-linear relationships, from taking logs of certain 
> variables, to quadratic regression, to using sin and other mathematical 
> devices. I'm not completely sure whether I'm "allowed" to square the y 
> variable; the book only squared the x variable in quadratic regression, 
> which I did first, and it fit quite well, but not as well as squaring 
> Approximate.Counts does:
>
> model <- lm(Approximate.Counts~X..Light.Transmission + 
> I(X..Light.Transmission^2), data=Standards)
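>
> With this version I can at least see how I would plug new transmission 
> readings straight in to get counts for my treated samples (the readings 
> below are made up, just to show the idea):
>
> new.readings <- data.frame(X..Light.Transmission = c(25, 50, 75))
> predict(model, newdata = new.readings)   # estimated counts for each reading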
>
>
> Any advice is greatly appreciated; it's the first time in my coursework 
> that I've really had to deal with regression on data that don't lie on a 
> straight line.
>
> Thanks,
> Ben Ward.
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide 
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
>


