[R] Memory usage problem while using nlm function
Kushal
kushal.shah at arisglobal.com
Wed Dec 31 12:58:40 CET 2014
Hi,
I am trying to do nonlinear minimization with the nlm() function, but for
large amounts of data it runs out of memory.

The code I am using:
## Negative log-likelihood of a two-component negative binomial mixture,
## with p = (alpha1, beta1, alpha2, beta2, w) and w = mixture weight
f <- function(p, n11, E) {
  sum(-log(p[5]       * dnbinom(n11, size = p[1], prob = p[2] / (p[2] + E)) +
           (1 - p[5]) * dnbinom(n11, size = p[3], prob = p[4] / (p[4] + E))))
}
p_out <- nlm(f, p = c(alpha1 = 0.2, beta1 = 0.06, alpha2 = 1.4, beta2 = 1.8, w = 0.1),
             n11 = n11_c, E = E_c)
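
In case it helps, data of roughly this shape reproduces the problem for
me (the sizes and distributions below are simulated only for
illustration; my real n11_c and E_c come from elsewhere):

## Illustrative test data, not my real data
set.seed(1)
N <- 1e6
E_c   <- rexp(N, rate = 0.5)               # expected counts
n11_c <- rnbinom(N, size = 1, mu = E_c)    # observed counts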
When the n11_c or E_c vectors are too large, the call runs out of memory.
Please suggest a solution for this.
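
One workaround I am considering (I am not sure it is the right
approach) is to accumulate the sum in fixed-size blocks, so the
temporary vectors created at each evaluation are block-sized rather
than full-length; f_blocked and block are my own names:

## Same value as f(), but computed block by block to limit the size of
## the intermediate vectors allocated on each nlm() evaluation
f_blocked <- function(p, n11, E, block = 1e5) {
  total <- 0
  for (s in seq(1, length(n11), by = block)) {
    e  <- min(s + block - 1, length(n11))
    ni <- n11[s:e]
    Ei <- E[s:e]
    total <- total -
      sum(log(p[5]       * dnbinom(ni, size = p[1], prob = p[2] / (p[2] + Ei)) +
              (1 - p[5]) * dnbinom(ni, size = p[3], prob = p[4] / (p[4] + Ei))))
  }
  total
}

Is there a better way to handle this within nlm() itself?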