[R] understanding nlm

Gabor Grothendieck ggrothendieck at myway.com
Fri Jun 25 16:02:36 CEST 2004

Some things to try are:

1. the nls function
2. replacing p with 1/p
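Both suggestions can be sketched on simulated data (the curve values, noise level, and starting points below are illustrative, not taken from Steve's data). Writing the model with q = 1/p keeps the optimizer in a better-scaled region when learning is very rapid: a large p becomes a small positive q.

```r
## Simulate a rapid-learning curve of the same form as csexponential:
## ti = 2 (asymptote), t1 = 8 (initial level), p = 2 (learning rate)
set.seed(1)
x <- 1:20
y <- 2 + abs(8 - 2) * exp(-2 * (x - 1)) + rnorm(length(x), sd = 0.05)

## Suggestion 1: fit with nls() (least squares directly) instead of nlm()
fit1 <- nls(y ~ ti + abs(t1 - ti) * exp(-p * (x - 1)),
            start = list(t1 = 8, ti = 2, p = 1))

## Suggestion 2: reparameterize with q = 1/p, so rapid learning
## corresponds to a small q rather than a large p
fit2 <- nls(y ~ ti + abs(t1 - ti) * exp(-(x - 1) / q),
            start = list(t1 = 8, ti = 2, q = 1))

coef(fit1)[["p"]]      # estimate of p
1 / coef(fit2)[["q"]]  # p recovered from the q parameterization
```

The two fits describe the same curve; only the coordinate the optimizer works in differs, which can matter for convergence and stability when p is large.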

Steven Lacey <slacey <at> umich.edu> writes:

: Hi, 
: I am using the nlm() function to fit the following exponential function to
: some data by minimizing squared differences between predicted and observed
: values:
: csexponential <- function(x, t1, ti, p){
:     ti + abs(t1 - ti)*(exp(-(p*(x-1))))
: }
: As background, the data is performance measured across time. As you might
: imagine, we get rapid improvement across the first couple of time points and
: then the improvement becomes more gradual. In psychology this is known as
: the power law of practice. 
: For some cases the learning is so rapid that the function appears to have a
: notch (imagine an "L" shaped function). In these cases the parameter
: estimate for the power, p, is large (typically p>15). 
: I have repeated the fitting procedure on the same set of data and appear
: to have found that, with the same starting values and arguments to nlm, I
: get somewhat different values for p in those cases of extremely rapid
: learning described above. For example, one time p=21 and another p=23. The relative
: change is not huge, but I would like the parameter estimates to be stable
: across replications with the same data/settings.
: It is certainly possible that I changed something in the code inadvertently
: and that is why I am observing these discrepancies. However, it is also
: possible that there is some random decision making within nlm, so that if
: nlm finds a region where the fits are equally good, it may return slightly
: different values each time it is run because it lands in a different
: location.
: I suspect my latter interpretation is false because the estimated values do
: not change after each and every replication of the analysis. But I thought I
: might ask. 
: Thanks for any insight you can provide, 
: Steve Lacey