[R] Linear Model with curve fitting parameter?
stephen sefick
ssefick at gmail.com
Tue Apr 5 01:50:34 CEST 2011
Thank you very much for all of your help.
On Mon, Apr 4, 2011 at 6:10 PM, Steven McKinney <smckinney at bccrc.ca> wrote:
>
>
>> -----Original Message-----
>> From: stephen sefick [mailto:ssefick at gmail.com]
>> Sent: April-04-11 2:49 PM
>> To: Steven McKinney
>> Subject: Re: [R] Linear Model with curve fitting parameter?
>>
>> Steven:
>>
>> I am really sorry for my confusion. I hope this now makes sense.
>>
>> b0 == y intercept == y-intercept == (intercept) fit by lm
>>
>> a <- 1:10
>> b <- 1:10
>>
>> summary(lm(a~b))
>> #to show what I was calling b0
>>
>> So...
>>
>> ################################################
>> manning
>>
>> Q = K*A*(R^b2)*(S^b3)
>>
>> log(Q) = log(K)+log(A)+(b2*log(R))+(b3*log(S))
>
> Okay, using this notation, this appears to be the original
> model you queried about. So for this model, as I showed
> before,
>
> Let Z = log(Q) - log(A)
>
> E(Z) = b0 + b2*log(R) + b3*log(S)
> = log(K) + b2*log(R) + b3*log(S)
>
> Fitting the model lm(Z ~ log(R) + log(S))
> will yield parameter estimates b_hat_0, b_hat_2, b_hat_3
> where
> b_hat_0 (the fitted model intercept) is an estimate of b0 (which is log(K)),
> b_hat_2 is an estimate of b2,
> b_hat_3 is an estimate of b3.
>
> So in answer to your previous question, b0 is
> log(K) itself, not ( log(Qintercept)+log(K) ),
> so an estimate for K is exp(b_hat_0)
>
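[Editor's note: a minimal sketch of this fit in R with simulated data. The sample size, parameter values, and noise level below are hypothetical, not from the poster's dataset.]

```r
## Simulate from the manning form Q = K * A * R^b2 * S^b3
## (all numbers hypothetical)
set.seed(1)
n <- 50
A <- runif(n, 1, 100)       # cross-sectional area
R <- runif(n, 0.1, 5)       # hydraulic radius
S <- runif(n, 1e-4, 0.05)   # slope
K <- 1.2; b2 <- 2/3; b3 <- 0.5
Q <- K * A * R^b2 * S^b3 * exp(rnorm(n, sd = 0.05))  # multiplicative error

## log(A) enters with a known coefficient of 1, so move it to the left side
Z   <- log(Q) - log(A)
fit <- lm(Z ~ log(R) + log(S))

exp(coef(fit)[1])  # b_hat_0 back-transformed: estimate of K
coef(fit)[2:3]     # estimates of b2 and b3
```

Equivalently, `lm(log(Q) ~ offset(log(A)) + log(R) + log(S))` keeps log(A) on the right-hand side with its coefficient fixed at 1.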
>
>>
>> ################################################
>> dingman
>> Q = K*(A^b1)*(R^b2)*(S^b3*log(S))
>>
>> log(Q) = log(K)+(b1*log(A))+(b2*log(R))+(b3*(log(S))^2)
>
> The dingman model notation is ambiguous. Is the last
> term S^(b3*log(S)) or (S^b3)*log(S) ?
>
> Previous email showed
>
> > dingman
> > log(Q)=log(b0)+log(K)+a*log(A)+r*log(R)+s*(log(S))^2
>
> which implies (if I ignore the log(b0) term)
> Q = K*(A^a)*(R^r)*(exp(log(S)*log(S))^s)
> = K*(A^a)*(R^r)*(S^(log(S)*s))
>
> This is linearizable as
>
> log(Q) = log(K) + a*log(A) + r*log(R) + s*(log(S))^2
> = b0 + b1*log(A) + b2*log(R) + b3*(log(S)^2)
>
> Fitting lm(log(Q) ~ log(A) + log(R) + I(log(S)^2) ... )
> will yield estimates b_hat_0, b_hat_1, b_hat_2 and b_hat_3
> where b_hat_0 is an estimate of b0 = log(K) so an estimate of K is exp(b_hat_0),
> b_hat_1 is an estimate of b1 = a,
> b_hat_2 is an estimate of b2 = r,
> b_hat_3 is an estimate of b3 = s
>
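[Editor's note: assuming the S^(b3*log(S)) reading of the dingman model, a small simulated check of this fit; all numbers are hypothetical.]

```r
## Simulate from the dingman form Q = K * A^b1 * R^b2 * S^(b3*log(S))
## (equivalently log(Q) = log(K) + b1*log(A) + b2*log(R) + b3*(log(S))^2)
set.seed(2)
n <- 50
A <- runif(n, 1, 100); R <- runif(n, 0.1, 5); S <- runif(n, 1e-4, 0.05)
K <- 2; b1 <- 0.8; b2 <- 0.6; b3 <- -0.1
Q <- K * A^b1 * R^b2 * S^(b3 * log(S)) * exp(rnorm(n, sd = 0.05))

## I() protects the squared term inside the formula
fit <- lm(log(Q) ~ log(A) + log(R) + I(log(S)^2))

exp(coef(fit)[1])  # estimate of K
coef(fit)[4]       # estimate of b3
```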
>
>
>>
>> ################################################
>>
>> Bjerklie
>>
>> Q = K*(A^b1)*(R^b2)*(S^b3)
>>
>> log(Q) = log(K)+(b1*log(A))+(b2*log(R))+(b3*log(S))
>
> Fitting lm(log(Q) ~ log(A) + log(R) + log(S) ... )
> will yield estimates b_hat_0, b_hat_1, b_hat_2 and b_hat_3
> where b_hat_0 is an estimate of b0 = log(K) so an estimate of K is exp(b_hat_0),
> b_hat_1 is an estimate of b1 = a,
> b_hat_2 is an estimate of b2 = r,
> b_hat_3 is an estimate of b3 = s
>
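[Editor's note: the same pattern for the bjerklie form, again with hypothetical simulated data.]

```r
## Simulate from the bjerklie form Q = K * A^b1 * R^b2 * S^b3
set.seed(3)
n <- 50
A <- runif(n, 1, 100); R <- runif(n, 0.1, 5); S <- runif(n, 1e-4, 0.05)
K <- 1.5; b1 <- 0.9; b2 <- 0.7; b3 <- 0.3
Q <- K * A^b1 * R^b2 * S^b3 * exp(rnorm(n, sd = 0.05))

fit <- lm(log(Q) ~ log(A) + log(R) + log(S))

exp(coef(fit)[1])  # estimate of K
coef(fit)[-1]      # estimates of b1, b2, b3
```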
>
> Best
>
> Steve McKinney
>
>>
>> ################################################
>>
>>
>>
>>
>>
>> On Mon, Apr 4, 2011 at 2:58 PM, Steven McKinney <smckinney at bccrc.ca> wrote:
>> >
>> >> -----Original Message-----
>> >> From: stephen sefick [mailto:ssefick at gmail.com]
>> >> Sent: April-03-11 5:35 PM
>> >> To: Steven McKinney
>> >> Cc: R help
>> >> Subject: Re: [R] Linear Model with curve fitting parameter?
>> >>
>> >> Steven:
>> >>
>> >> You are exactly right; sorry, I was confused.
>> >>
>> >>
>> >> #######################################################
>> >> so log(y-intercept)+log(K) is a constant called b0 (is this right?)
>> >
>> > Doesn't look right to me based on the information you've provided.
>> > I don't see anything labeled "y" in your previous emails, so I'm
>> > not clear on what y is and how it relates to the original model
>> > you described
>> >
>> > > >> I have a model Q=K*A*(R^r)*(S^s)
>> > > >>
>> > > >> A, R, and S are data I have and K is a curve fitting parameter.
>> >
>> > If the model is
>> >
>> > Q=K*A*(R^r)*(S^s)
>> >
>> > then
>> >
>> > log(Q) = log(K) + log(A) + r*log(R) + s*log(S)
>> >
>> > Rearranging yields
>> >
>> > log(Q) - log(A) = log(K) + r*log(R) + s*log(S)
>> >
>> > Let Z = log(Q) - log(A) = log(Q/A)
>> >
>> > so
>> >
>> > Z = log(K) + r*log(R) + s*log(S)
>> >
>> > and a linear model fit of
>> >
>> > Z ~ log(R) + log(S)
>> >
>> > will yield parameter estimates for the linear equation
>> >
>> > E(Z) = B0 + B1*log(R) + B2*log(S)
>> >
>> > (E(Z) = expected value of Z)
>> >
>> > so B0 estimate is an estimate of log(K)
>> > B1 estimate is an estimate of r
>> > B2 estimate is an estimate of s
>> >
>> > and these are the only parameters you described in the original model.
>> >
>> >
>> >>
>> >> lm(log(Q)~log(A)+log(R)+log(S)-1)
>> >>
>> >> is fitting the model
>> >>
>> >> log(Q)=a*log(A)+r*log(R)+s*log(S) (no beta 0)
>> >>
>> >> and
>> >>
>> >> lm(log(Q)~log(A)+log(R)+log(S))
>> >>
>> >>
>> >> is fitting the model
>> >>
>> >> log(Q)=b0+a*log(A)+r*log(R)+s*log(S)
>> >
>> > K has disappeared from these equations so these model fits do
>> > not correspond to the model originally described. Now a b0
>> > appears, and is used in models below. I think changing notation
>> > is also adding confusion. What are "y" and "intercept" you
>> > discuss above, in relation to your original notation?
>> >
>> >>
>> >> ######################################################
>> >>
>> >> These are the models I am trying to fit and if I have reasoned
>> >> correctly above then I should be able to fit the below models
>> >> similarly.
>> >
>> > You will be able to fit models appropriately once you have a
>> > clearly defined system of notation that allows you to map between
>> > the proposed data model, the parameters in that model, and the
>> > corresponding regression equations.
>> >
>> > Once you have consistent notation, you will be able to see
>> > if you can express your model as a linear regression, or
>> > if not, what kind of non-linear regression you will need to
>> > do to get estimates for the parameters in your model.
>> >
>> > Best
>> >
>> > Steve McKinney
>> >
>> >>
>> >> manning
>> >> log(Q)=log(b0)+log(K)+log(A)+r*log(R)+s*log(S)
>> >>
>> >> dingman
>> >> log(Q)=log(b0)+log(K)+a*log(A)+r*log(R)+s*(log(S))^2
>> >>
>> >> bjerklie
>> >> log(Q)=log(b0)+log(K)+a*log(A)+r*log(R)+s*log(S)
>> >>
>> >> #######################################################
>> >>
>> >> Thank you for all of your help!
>> >>
>> >> Stephen
>> >>
>> >> On Fri, Apr 1, 2011 at 2:44 PM, Steven McKinney <smckinney at bccrc.ca> wrote:
>> >> >
>> >> >> -----Original Message-----
>> >> >> From: stephen sefick [mailto:ssefick at gmail.com]
>> >> >> Sent: April-01-11 5:44 AM
>> >> >> To: Steven McKinney
>> >> >> Cc: R help
>> >> >> Subject: Re: [R] Linear Model with curve fitting parameter?
>> >> >>
>> >> >> Setting Z = Q - A would have the incorrect dimensions. I could use Z = Q/A.
>> >> >
>> >> > I suspect this is confusion about what Q is. I was presuming that
>> >> > the Q in the following formula was log(Q), with Q from the original data.
>> >> >
>> >> >> >> I have taken the log of the data that I have and this is the model
>> >> >> >> formula without the K part
>> >> >> >>
>> >> >> >> lm(Q~offset(A)+R+S, data=x)
>> >> >
>> >> > If the model is
>> >> >
>> >> > Q=K*A*(R^r)*(S^s)
>> >> >
>> >> > then
>> >> >
>> >> > log(Q) = log(K) + log(A) + r*log(R) + s*log(S)
>> >> >
>> >> > Rearranging yields
>> >> >
>> >> > log(Q) - log(A) = log(K) + r*log(R) + s*log(S)
>> >> >
>> >> > so what I labeled 'Z' below is
>> >> >
>> >> > Z = log(Q) - log(A) = log(Q/A)
>> >> >
>> >> > so
>> >> >
>> >> > Z = log(K) + r*log(R) + s*log(S)
>> >> >
>> >> > and a linear model fit of
>> >> >
>> >> > Z ~ log(R) + log(S)
>> >> >
>> >> > will yield parameter estimates for the linear equation
>> >> >
>> >> > E(Z) = B0 + B1*log(R) + B2*log(S)
>> >> >
>> >> > (E(Z) = expected value of Z)
>> >> >
>> >> > so B0 estimate is an estimate of log(K)
>> >> > B1 estimate is an estimate of r
>> >> > B2 estimate is an estimate of s
>> >> >
>> >> > More details and careful notation will eventually lead
>> >> > to a reasonable description and analysis strategy.
>> >> >
>> >> >
>> >> > Best
>> >> >
>> >> > Steve McKinney
>> >> >
>> >> >
>> >> >
>> >> >> Is fitting an nls model the same as fitting an ols model? These data are
>> >> >> hydraulic data from ~47 sites. To assess predictive ability I am
>> >> >> removing one site, fitting a new model, and then assessing the fit with
>> >> >> a myriad of model assessment criteria. Should I get the same answer
>> >> >> with ols vs. nls? Thank you for all of your help.
>> >> >>
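[Editor's note: on the ols-vs-nls question above — lm on the log scale assumes multiplicative (lognormal) error, while nls on the original scale assumes additive error, so the two will generally not give identical estimates. A hypothetical side-by-side sketch:]

```r
## Simulate from Q = K * A * R^r * S^s with multiplicative noise
## (all numbers hypothetical)
set.seed(4)
n <- 60
A <- runif(n, 1, 100); R <- runif(n, 0.1, 5); S <- runif(n, 1e-4, 0.05)
Q <- 1.5 * A * R^0.6 * S^0.4 * exp(rnorm(n, sd = 0.05))

## OLS on the log scale: multiplicative-error model
ols <- lm(log(Q) ~ offset(log(A)) + log(R) + log(S))
exp(coef(ols)[1])  # estimate of K

## NLS on the original scale: additive-error model
nls_fit <- nls(Q ~ K * A * R^r * S^s,
               start = list(K = 1, r = 0.5, s = 0.5))
coef(nls_fit)
```

With clean simulated data both recover similar parameter values, but their estimates and standard errors diverge when the error structure on the original scale is not additive.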
>> >> >> Stephen
>> >> >>
>> >> >> On Thu, Mar 31, 2011 at 8:34 PM, Steven McKinney <smckinney at bccrc.ca> wrote:
>> >> >> >
>> >> >> >> -----Original Message-----
>> >> >> >> From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org] On Behalf Of stephen
>> >> >> sefick
>> >> >> >> Sent: March-31-11 3:38 PM
>> >> >> >> To: R help
>> >> >> >> Subject: [R] Linear Model with curve fitting parameter?
>> >> >> >>
>> >> >> >> I have a model Q=K*A*(R^r)*(S^s)
>> >> >> >>
>> >> >> >> A, R, and S are data I have and K is a curve fitting parameter. I
>> >> >> >> have linearized as
>> >> >> >>
>> >> >> >> log(Q)=log(K)+log(A)+r*log(R)+s*log(S)
>> >> >> >>
>> >> >> >> I have taken the log of the data that I have and this is the model
>> >> >> >> formula without the K part
>> >> >> >>
>> >> >> >> lm(Q~offset(A)+R+S, data=x)
>> >> >> >>
>> >> >> >> What is the formula that I should use?
>> >> >> >
>> >> >> > Let Z = Q - A for your logged data.
>> >> >> >
>> >> >> > Fitting lm(Z ~ R + S, data = x) should yield
>> >> >> > intercept parameter estimate = estimate for log(K)
>> >> >> > R coefficient parameter estimate = estimate for r
>> >> >> > S coefficient parameter estimate = estimate for s
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> > Steven McKinney
>> >> >> >
>> >> >> > Statistician
>> >> >> > Molecular Oncology and Breast Cancer Program
>> >> >> > British Columbia Cancer Research Centre
>> >> >> >
>> >> >> >
>> >> >> >
>> >> >> >>
>> >> >> >> Thanks for all of your help. I can provide a subset of data if necessary.
>> >> >> >>
>> >> >> >>
>> >> >> >>
>> >> >> >> --
>> >> >> >> Stephen Sefick
>> >> >> >> ____________________________________
>> >> >> >> | Auburn University |
>> >> >> >> | Biological Sciences |
>> >> >> >> | 331 Funchess Hall |
>> >> >> >> | Auburn, Alabama |
>> >> >> >> | 36849 |
>> >> >> >> |___________________________________|
>> >> >> >> | sas0025 at auburn.edu |
>> >> >> >> | http://www.auburn.edu/~sas0025 |
>> >> >> >> |___________________________________|
>> >> >> >>
>> >> >> >> Let's not spend our time and resources thinking about things that are
>> >> >> >> so little or so large that all they really do for us is puff us up and
>> >> >> >> make us feel like gods. We are mammals, and have not exhausted the
>> >> >> >> annoying little problems of being mammals.
>> >> >> >>
>> >> >> >> -K. Mullis
>> >> >> >>
>> >> >> >> "A big computer, a complex algorithm and a long time does not equal science."
>> >> >> >>
>> >> >> >> -Robert Gentleman
>> >> >> >> ______________________________________________
>> >> >> >> R-help at r-project.org mailing list
>> >> >> >> https://stat.ethz.ch/mailman/listinfo/r-help
>> >> >> >> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> >> >> >> and provide commented, minimal, self-contained, reproducible code.
>> >> >> >
>> >> >>
>> >> >>
>> >> >>
>> >> >
>> >>
>> >>
>> >>
>> >
>>
>>
>>
>
--
Stephen Sefick