[Rd] Max likelihood using GPU

oyvfos oyvfos at yahoo.no
Wed May 18 11:07:23 CEST 2011


Dear all,
Probably many of you experience long computation times when estimating a large
number of parameters by maximum likelihood with functions that require
numerical methods such as integration or root-finding. Maximum likelihood is
an example of parallelization that could successfully utilize a GPU. The
general algorithm is described here:
http://openlab-mu-internal.web.cern.ch/openlab-mu-internal/03_Documents/4_Presentations/Slides/2010-list/CHEP-Maximum-likelihood-fits-on-GPUs.pdf.
Is it possible to implement this algorithm in R?
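The structure the slides describe is data-parallel: each GPU thread evaluates
the log-density of one event independently, and a reduction sums the terms into
the negative log-likelihood that the minimizer queries. As a minimal sketch of
that map-then-reduce structure (in Python rather than R, with an exponential
model and a simple ternary search standing in for a real minimizer such as
optim() or Minuit; all names here are illustrative assumptions, not from the
slides):

```python
import math
import random

def log_pdf(x, lam):
    # Per-event log-density of an Exponential(lam) distribution.
    # On a GPU, each thread would evaluate one such term.
    return math.log(lam) - lam * x

def neg_log_likelihood(data, lam):
    # Map step (independent per observation) followed by a sum
    # reduction, mirroring the kernel-then-reduce GPU pattern.
    return -sum(log_pdf(x, lam) for x in data)

def fit_rate(data, lo=1e-3, hi=50.0, iters=200):
    # Ternary search on the unimodal NLL; a real fit would hand
    # neg_log_likelihood to a proper minimizer instead.
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if neg_log_likelihood(data, m1) < neg_log_likelihood(data, m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

random.seed(0)
data = [random.expovariate(2.0) for _ in range(10000)]
lam_hat = fit_rate(data)
# For this model the analytic MLE is 1/mean(data), so lam_hat
# should agree with len(data) / sum(data) to high precision.
```

In R the same structure maps naturally onto vectorized density calls
(e.g. summing dexp(x, lam, log = TRUE)) passed to optim(); the GPU version
replaces only the per-event evaluation and the reduction.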
Kind regards, Oyvind Foshaug



