[R-sig-dyn-mod] time dependent forcing of model parameters in compiled code

Andras Farkas motyocska at yahoo.com
Mon Apr 17 13:05:51 CEST 2017


Dear All,


I managed to cut down my previous post a bit, which should make my goal clearer...

we have:


# R code

require(deSolve)
require(FME)

## piecewise-constant dosing input
intimes <- c(0, 0.5, 12, 12.5, 24)
input   <- c(800, 0, 1500, 0, 0)
forc    <- approxfun(intimes, input, method = "constant", rule = 2)

observed <- matrix(ncol = 2, byrow = TRUE, data = c(3.3, 14, 23.6, 3))
colnames(observed) <- c("time", "cp.a")
tout <- unique(sort(c(intimes, observed[, "time"])))

## cost function passed to modMCMC
obj <- function(x, parset = names(x)) {
  pars[parset] <- x
  out <- model(pars, tout)
  return(modCost(obs = observed, model = out))
}

t    <- seq(0, 25, 0.1)
MAXT <- max(t)
pars <- c(k1 = 0.08, k2 = 0.2, v1 = 25, v2 = 25, MAXT = MAXT)
inp  <- data.frame(t, forc(t))
forcings <- inp

setwd("  ")  # path to the folder with the C source (left blank here)
system("R CMD SHLIB modelSIM_forcedtest.c")
dyn.load(paste("modelSIM_forcedtest", .Platform$dynlib.ext, sep = ""))
# dyn.unload(paste("modelSIM_forcedtest", .Platform$dynlib.ext, sep = ""))


model <- function(pars, times = seq(0, max(t), by = 0.1)) {
  state <- c(a = 0)
  return(ode(y = state,
             times = times,
             func = "derivs",
             parms = pars,
             dllname = "modelSIM_forcedtest",
             initforc = "forcc",
             forcings = forcings,
             method = "lsoda",
             rtol = 1e-8,
             atol = 1e-8,
             hmax = 0.1,
             initfunc = "initmod",
             nout = 1,
             outnames = c("cp.a")))
}

plot(model(pars))

final <- modMCMC(p = c(k1 = 0.08, k2 = 0.16, v1 = 25, v2 = 25), f = obj,
                 prior = NULL, lower = c(0.0001, 0.00001, 0.0001, 0.0001),
                 jump = NULL, niter = 1000, updatecov = 10, wvar0 = 1,
                 var0 = NULL, burninlength = 10)

## compare the initial-guess fit with the MCMC best fit
pars2 <- c(final$bestpar, MAXT)
plot(model(pars), model(pars2))
points(observed)


# C code as I have it now:

/* file modelSIM_forcedtest.c */

#include <R.h>
#include <math.h>   /* fmod() */

static double parms[5];
static double forc[1];
#define k1 parms[0]
#define k2 parms[1]
#define v1 parms[2]
#define v2 parms[3]
#define MAXT parms[4]
#define import1 forc[0]


/* parameter initializer */

void initmod(void (* odeparms)(int *, double *))
{
  int N = 5;
  odeparms(&N, parms);
}

/* forcing initializer */

void forcc(void (* odeforcs)(int *, double *))
{
  int N = 1;
  odeforcs(&N, forc);
}

/* Derivatives and 1 output variable */

void derivs(int *neq, double *t, double *y, double *ydot,
            double *yout, int *ip)
{
  double k, v;

  if (ip[0] < 1) error("nout should be at least 1");

  /* switch both parameters at t = 12 within each cycle of length MAXT */
  if (fmod(*t, MAXT) < 12) {
    k = k1;
    v = v1;
  } else {
    k = k2;
    v = v2;
  }

  ydot[0] = import1 - k * y[0];
  yout[0] = y[0] / v;
}


/* END file modelSIM_forcedtest.c */



The if/else statement in the C code allows a time-dependent change of the model parameters, which in turn allows "better" identification of the bestpar parameters, especially in more complex models. My question: do you know of a working example that uses forcing functions to implement time-dependent parameters in C? (I am aware of the approxfun approach in R.) While the if/else gets the job done, the goal is a set of code that uses a forcing function (if possible), so that if the parameters need to change more than once over time, the C model code would not have to be rewritten each time.
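To make the idea concrete, here is a minimal, untested sketch of what I have in mind: treat k and v themselves as forcings, so their time courses come from R as data sets instead of being hard-coded in C. The file name modelSIM_forced2.c and the vectors kvals/vvals below are hypothetical, and I am assuming that deSolve accepts a list of two-column data sets for the forcings argument and that fcontrol = list(method = "constant") keeps them piecewise constant.

/* sketch: file modelSIM_forced2.c (hypothetical) */

#include <R.h>

static double parms[1];   /* only MAXT remains a true parameter (unused in derivs) */
static double forc[3];    /* dosing input plus the two time-varying parameters */
#define MAXT    parms[0]
#define import1 forc[0]
#define kt      forc[1]   /* k(t), interpolated from the R-side data */
#define vt      forc[2]   /* v(t), interpolated from the R-side data */

void initmod(void (* odeparms)(int *, double *))
{
  int N = 1;
  odeparms(&N, parms);
}

void forcc(void (* odeforcs)(int *, double *))
{
  int N = 3;   /* three forcing sets now */
  odeforcs(&N, forc);
}

void derivs(int *neq, double *t, double *y, double *ydot,
            double *yout, int *ip)
{
  if (ip[0] < 1) error("nout should be at least 1");
  ydot[0] = import1 - kt * y[0];   /* k comes from the forcing data */
  yout[0] = y[0] / vt;             /* so does v */
}

/* END sketch */

On the R side the call would then look something like ode(..., forcings = list(inp, data.frame(t, kvals), data.frame(t, vvals)), fcontrol = list(method = "constant", rule = 2)), so any number of parameter switches would only change the data passed from R, never the C code.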

much appreciate your help,



Andras Farkas


