[R] Fitting data with optim or nls--different time scales

Leslie Chavez leslou at ctbp.ucsd.edu
Tue Aug 8 20:22:06 CEST 2006


I have a system of ODEs I can solve with lsoda:

    model <- function(t, x, parms) {
        # parameter definitions
        lambda <- parms[1]; beta  <- parms[2]
        d      <- parms[3]; delta <- parms[4]
        p      <- parms[5]; c     <- parms[6]
        xdot <- numeric(3)
        xdot[1] <- lambda - (d * x[1]) - (beta * x[3] * x[1])
        xdot[2] <- (beta * x[3] * x[1]) - (delta * x[2])
        xdot[3] <- (p * x[2]) - (c * x[3])
        list(xdot)
    }
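For concreteness, I call it along these lines (the parameter values here are placeholders for illustration, not my real estimates, and I am assuming the odesolve package, which provides lsoda on my version of R):

    library(odesolve)   # provides lsoda(); newer R versions use deSolve instead

    # Placeholder parameter values for illustration only -- not my real estimates
    parms <- c(1e4, 2e-7, 0.01, 0.7, 100, 3)   # lambda, beta, d, delta, p, c
    x0    <- c(49994, 8456, 6.16e-8)           # one value per state variable
    times <- seq(0, 25, by = 0.01)             # fine 0.01-day grid

    out <- lsoda(x0, times, model, parms)      # out[,1] = time, out[,4] = x[3]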

I want to fit the output out[,4] to experimental data that are only 
available on days 0, 7, 12, 14, 17, and 20. I don't know how to set up 
optim or nls so that it compares out[,4] only on those days, while still 
running lsoda on a time scale of 0.01 day.
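Concretely, I imagine the comparison step looking something like this (vdata is a stand-in for my measured values, and model, parms, and x0 are as above -- I am not sure this is the idiomatic way):

    obsdays <- c(0, 7, 12, 14, 17, 20)
    vdata   <- rep(NA_real_, length(obsdays))  # stand-in for experimental data

    out <- lsoda(x0, seq(0, 25, by = 0.01), model, parms)

    # round() guards against floating-point drift in the fine grid, so each
    # observation day matches exactly one row of the solution matrix
    idx <- match(obsdays, round(out[, 1], 2))
    ssq <- sum((out[idx, 4] - vdata)^2)        # residual sum of squares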

Below is the function I've been using to run optim, at the 
coarse-grained time scale:

Modelfit <- function(s) {
    parms[1:4] <- s[1:4]
    times <- c(0, 7, 12, 14, 17, 20, 25)
    # cat(times)
    # parms: (lambda, beta, d, delta, p, c)

    s0 <- c(49994, 8456, 6.16e-8, 0.012)   # initial values
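Putting those pieces together, this is the shape I think Modelfit should take, though I have not verified it (vdata is again a stand-in for my data; p and c stay fixed at the values in the global parms, and s0 is assumed to have one value per equation):

    Modelfit <- function(s) {
        parms[1:4] <- s[1:4]                 # fit lambda, beta, d, delta;
                                             # keep p and c from the global parms
        obsdays <- c(0, 7, 12, 14, 17, 20)
        times   <- seq(0, 25, by = 0.01)     # integrate on the fine grid
        out     <- lsoda(s0, times, model, parms)
        idx     <- match(obsdays, round(out[, 1], 2))
        sum((out[idx, 4] - vdata)^2)         # value for optim to minimise
    }

    fit <- optim(parms[1:4], Modelfit)       # Nelder-Mead by default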


Right now, lsoda is being run on too coarse-grained a time scale in the 
function Modelfit. Most examples of optim and nls I have found compare 
two data sets at the same times, and run lsoda on the time scale the 
data are available at, but I would like to run lsoda at a finer scale and 
compare only the appropriate time points with the experiment. I have also 
tried using nls, but I have the same problem. Does anyone have suggestions?

Thank you very much,

