[Rd] Master's project to coerce linux nvidia drivers to run generalised linear models
Marc Schwartz (via MN)
mschwartz at mn.rr.com
Mon Jan 23 22:47:38 CET 2006
On Mon, 2006-01-23 at 15:24 -0500, Oliver LYTTELTON wrote:
>
> Hi,
>
> I am working with a friend on a master's project. Our laboratory does a
> lot of statistical analysis using the R stats package, and we also have
> a lot of under-utilised NVIDIA cards sitting in the back of our
> networked Linux machines. Our idea is to coerce the Linux NVIDIA driver
> into running some of our statistical analyses for us. Our first thought
> was to code up a version of glm() specifically to run on the NVIDIA
> cards...
>
> Thinking that this might be of use to the broader community, we thought
> we would ask for feedback before starting.
>
> Any ideas...
>
> Thanks,
>
> Olly
Well, I'll bite.
My first reaction to this was: why?
Then I did some Googling and found the following article:
http://www.apcmag.com/apc/v3.nsf/0/5F125BA4653309A3CA25705A0005AD27
And also noted the GPU Gems 2 site here:
http://developer.nvidia.com/object/gpu_gems_2_home.html
So, my newfound perspective is: why not?
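
If it helps with scoping the project: the computational core of glm() is
iteratively re-weighted least squares (IRLS), and the dense linear algebra
inside that loop is the natural candidate for offloading to the cards.
Below is a minimal sketch of the IRLS loop for a logistic-regression GLM in
plain R. The function name irls_logistic and the inputs X and y are
hypothetical, and this is not the actual glm.fit() code, just an
illustration of where the heavy matrix work sits.

irls_logistic <- function(X, y, tol = 1e-8, maxit = 25) {
  # X: n x p design matrix, y: 0/1 response (hypothetical inputs)
  beta <- rep(0, ncol(X))
  for (i in seq_len(maxit)) {
    eta <- drop(X %*% beta)        # linear predictor
    mu  <- 1 / (1 + exp(-eta))     # logistic mean function
    w   <- mu * (1 - mu)           # IRLS weights
    z   <- eta + (y - mu) / w      # working response
    WX  <- X * w                   # scale row i of X by w[i]
    # The weighted cross-products and the solve() below are the dense
    # linear algebra a GPU port would target.
    beta_new <- drop(solve(crossprod(WX, X), crossprod(WX, z)))
    if (max(abs(beta_new - beta)) < tol) { beta <- beta_new; break }
    beta <- beta_new
  }
  beta
}

# Quick check against R's own glm() on simulated data:
# set.seed(1)
# X <- cbind(1, matrix(rnorm(200), 100, 2))
# y <- rbinom(100, 1, 1 / (1 + exp(-drop(X %*% c(-0.5, 1, 2)))))
# cbind(irls_logistic(X, y), coef(glm(y ~ X - 1, family = binomial)))

Each iteration is essentially one weighted least-squares solve, so a first
cut could keep the loop in R and push only the crossprod/solve step onto
the card.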
Best wishes for success, especially since I have a certain affinity for
McGill...
HTH,
Marc Schwartz