[R] Non-Linear Regression (Cobb-Douglas and C.E.S)
Sundar Dorai-Raj
sundar.dorai-raj at PDF.COM
Sun Apr 18 13:55:22 CEST 2004
Mohammad Ehsanul Karim wrote:
> Dear Sundar Dorai-Raj,
>
> Thank you very much for pointing out that ALPHA should be exponentiated.
>
> However, as far as I understand, the parameters in the non-linear equation
> Y = ALPHA * (L^(BETA1)) * (K^(BETA2))
> and the coefficients of log(L) and log(K) of the following equation
> (after linearizing)
> log(Y) = log(ALPHA) +(BETA1)*log(L) + (BETA2)*log(K)
> should be the same when estimated from either equation. Is that true? If
> it is, why are the estimates from the two procedures (see below)
> different? Can you please explain?
> -----------------------------
> > coef(lm(log(Y)~log(L)+log(K), data=klein.data))
>
> (Intercept) log(L) log(K)
> -3.6529493 1.0376775 0.7187662
> -----------------------------
> > nls(Y~ALPHA * (L^(BETA1)) * (K^(BETA2)), data=klein.data, start =
> c(ALPHA=exp(-3.6529493),BETA1=1.0376775,BETA2 = 0.7187662), trace = TRUE)
>
> Nonlinear regression model
> model: Y ~ ALPHA * (L^(BETA1)) * (K^(BETA2))
> data: klein.data
> ALPHA BETA1 BETA2
> 0.003120991 0.414100040 1.513546235
> residual sum-of-squares: 3128.245
> -----------------------------
>
Not necessarily. In the first model, you're minimizing:
sum((log(Y) - log(Yhat))^2)
because the nonlinear model you're fitting is:
Y = ALPHA * L^BETA1 * K^BETA2 * ERROR
log(Y) = log(ALPHA) + BETA1 * log(L) + BETA2 * log(K) + log(ERROR)
Note the multiplicative error structure. In the second model you're
minimizing
sum((Y - Yhat)^2)
because the nonlinear model you're fitting is
Y = ALPHA * L^BETA1 * K^BETA2 + ERROR
Note the additive error structure. Different error structures, different
parameter estimates.
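To see this without the Klein data, here is a small simulation sketch (my
own illustration, not from the thread; all variable names and the true
parameter values are made up). Data are generated from a Cobb-Douglas model
with multiplicative log-normal error; the log-linear lm fit and the additive
nls fit then give similar but not identical estimates, because each
minimizes a different objective:

```r
# Simulated Cobb-Douglas data with MULTIPLICATIVE error
set.seed(1)
n <- 100
L <- runif(n, 1, 10)
K <- runif(n, 1, 10)
ALPHA <- 2; BETA1 <- 0.6; BETA2 <- 0.4
Y <- ALPHA * L^BETA1 * K^BETA2 * exp(rnorm(n, sd = 0.2))

# Fit 1: linearized model, minimizes sum((log(Y) - log(Yhat))^2)
fit.lm <- lm(log(Y) ~ log(L) + log(K))

# Fit 2: nonlinear model, minimizes sum((Y - Yhat)^2),
# started from the lm estimates (intercept exponentiated)
fit.nls <- nls(Y ~ a * L^b1 * K^b2,
               start = list(a  = exp(unname(coef(fit.lm)[1])),
                            b1 = unname(coef(fit.lm)[2]),
                            b2 = unname(coef(fit.lm)[3])))

# Both recover the truth approximately, but the estimates differ,
# because the two fits assume different error structures.
coef(fit.lm)
coef(fit.nls)
```

With a well-behaved simulated data set the two answers are close; on real
data with more noise (as in the Klein example above) the gap can be large.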
Also, the residual sum of squares for the nls fit is smaller, although
I'm not sure whether this comparison is really fair:
klein.lm <- lm(log(Y) ~ log(L) + log(K), data = klein.data)
# `start' is not shown here but can be copied from above
klein.nls <- nls(Y ~ ALPHA * L^BETA1 * K^BETA2, data = klein.data,
                 start = start, trace = TRUE)
rss.lm <- with(klein.data, sum((Y - exp(fitted(klein.lm)))^2)) # 3861.147
rss.nls <- with(klein.data, sum((Y - fitted(klein.nls))^2))    # 3128.245
Now, which one do you use? Depends on whether you believe you have
multiplicative errors (use lm) or additive errors (use nls).
--sundar