[Rd] Max likelihood using GPU
ral64 at cam.ac.uk
Wed May 18 14:27:15 CEST 2011
I believe this is possible to implement. There is already ongoing work on using the GPU from R, and it uses the CUDA toolkit, as does the reference you supplied.
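As a rough illustration (my own sketch, not from the reference): the per-observation log-likelihood terms are independent, so the evaluation is an embarrassingly parallel map-reduce. The snippet below parallelizes over CPU cores with `parallel::mclapply`; a GPU version (e.g. via CUDA) would follow the same structure, with each thread computing one observation's term before a reduction. The normal model and parameter names here are invented for the example.

```r
library(parallel)
library(stats)

set.seed(1)
x <- rnorm(1000, mean = 2, sd = 1.5)  # simulated data for the example

# Negative log-likelihood for a normal model. Each observation's term is
# evaluated independently (map), then summed (reduce) -- the same pattern
# a GPU kernel would use. Note: mc.cores > 1 requires a Unix-alike.
negloglik <- function(par, data) {
  mu <- par[1]
  sigma <- exp(par[2])  # optimize log-sd so the parameter is unconstrained
  terms <- mclapply(data,
                    function(xi) dnorm(xi, mean = mu, sd = sigma, log = TRUE),
                    mc.cores = 2L)
  -sum(unlist(terms))
}

fit <- optim(c(0, 0), negloglik, data = x)
estimates <- c(mu = fit$par[1], sigma = exp(fit$par[2]))
print(estimates)
```

For a problem of this size the forking overhead dominates, so this is only a shape demonstration; the payoff comes when each term itself requires numerical integration or root-finding, as in the original question.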
On 18 May 2011, at 10:07, oyvfos wrote:
> Dear all,
> Probably many of you experience long computation times when estimating a large
> number of parameters by maximum likelihood with functions that require
> numerical methods such as integration or root-finding. Maximum likelihood is
> an example of parallelization that could successfully utilize the GPU. The
> general algorithm is described here:
> Is it possible to implement this algorithm in R ?
> Kind regards, Oyvind Foshaug
> View this message in context: http://r.789695.n4.nabble.com/Max-likelihood-using-GPU-tp3532034p3532034.html
> Sent from the R devel mailing list archive at Nabble.com.
> R-devel at r-project.org mailing list