[R] Mathematica now working with Nvidia GPUs --> any plan for R?
Prof Brian Ripley
ripley at stats.ox.ac.uk
Wed Nov 19 07:56:37 CET 2008
On Tue, 18 Nov 2008, Emmanuel Levy wrote:
> Dear All,
>
> I just read an announcement saying that Mathematica is launching a
> version working with Nvidia GPUs. It is claimed that it'd make it
> ~10-100x faster!
> http://www.physorg.com/news146247669.html
Well, lots of things are 'claimed' in marketing (and Wolfram is not shy
about making claims).  I think you need lots of GPUs, as well as the right
problem.
> I was wondering if you are aware of any development going into this
> direction with R?
It seems so, as users have asked about using CUDA in R packages.
Parallelization is not at all easy, but there is work on making R better
able to use multi-core CPUs, which are expected to become far more common
than tens of GPUs.
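
Purely as an illustration, and only a minimal sketch: with the 'snow'
package (on CRAN) a simple embarrassingly parallel job can be farmed out
to a couple of local worker processes.  The two-worker cluster and the toy
bootstrap below are assumptions for the example, not anything specific to
the GPU work mentioned above.

  ## minimal sketch of CPU-level parallelism with the 'snow' package;
  ## cluster size and workload are purely illustrative
  library(snow)

  cl <- makeCluster(2, type = "SOCK")   # two worker processes on this machine

  ## a toy embarrassingly parallel job: bootstrap estimates of a mean
  x <- rnorm(1e5)
  boot_mean <- function(i, x) mean(sample(x, replace = TRUE))

  res <- parSapply(cl, 1:200, boot_mean, x = x)

  stopCluster(cl)
  summary(res)

Calling CUDA from R, by contrast, means writing the kernels in C/CUDA and
wrapping them through R's compiled-code interface (.C/.Call), which is a
much larger undertaking.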
> Thanks for sharing your thoughts,
>
> Best wishes,
>
> Emmanuel
PS: R-devel is the list on which to discuss the development of R.
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595