[R-SIG-Finance] Exploratory analyses: Experience using the gputools package for Nvidia graphics accelerators?
Brian G. Peterson
brian at braverock.com
Fri Oct 16 20:29:32 CEST 2009
Gero Schwenk wrote:
> Hi, dear finance modelers!
> I want to employ a systematic approach to explore promising
> predictors for trading models. I am therefore thinking about
> experimenting with parallel processing on GPUs (Nvidia graphics
> accelerators) - these are quite cheap (150-300€) and usually contain
> more than 240 processing units.
>
> There is a package called "gputools" (
> http://cran.r-project.org/web/packages/gputools/index.html ) which
> seems to originate from the bioinformatics community and implements
> very interesting functionality for exploratory analysis and
> large-scale predictive modeling: calculation of distance metrics for
> clustering, tests for Granger causality, approximation of mutual
> information, calculation of correlation coefficients, and estimation
> and prediction with support vector machines and support vector
> regression.
>
> Now my question: does anybody have experience using this package, or
> GPU/parallel processing in general, for exploration? Or do you use
> other environments or approaches?
This is my personal experience and thoughts only, and not as
well-informed as I might like, ymmv.
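For anyone who hasn't looked at the package yet, the functions Gero lists are meant to mirror the corresponding base R calls. A rough sketch of what using them looks like, assuming the gpuCor/gpuDist/gpuHclust names and defaults from the CRAN documentation (I haven't double-checked the exact signatures), plus a working CUDA toolkit and an Nvidia card:

  ## sketch only: needs gputools installed against a working CUDA
  ## toolkit and an Nvidia GPU; function names taken from the CRAN page
  library(gputools)

  set.seed(1)
  x <- matrix(rnorm(1000 * 50), ncol = 50)  # e.g. 1000 obs. of 50 candidate predictors

  gcor <- gpuCor(x, method = "pearson")        # correlations on the GPU
  gd   <- gpuDist(t(x), method = "euclidean")  # distances between the 50 predictors
  gh   <- gpuHclust(gd, method = "complete")   # hierarchical clustering of the predictors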
I know firms in finance that are making extensive use of different GPU
architectures. They are *all* doing a lot of low-level C programming to
do it, in many cases using the API directly, or using reference
implementations of linear and matrix algebra packages tuned for the GPU
they've chosen. I can appreciate that approach if you have the
resources to engage in it.
My personal feeling is that the "general purpose" in "general-purpose
GPU" will not be realized until the linear algebra libraries that are
hidden from most users transparently support execution on GPUs. See,
for example, the MAGMA project, run by the folks who brought us the
widely deployed ATLAS.
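To make that concrete, a script like the one below never mentions the GPU at all; the heavy lifting goes through whatever BLAS/LAPACK R happens to be linked against, so a GPU-backed BLAS along the lines of MAGMA would speed it up without touching the R code (sizes here are arbitrary, just a sketch):

  ## no GPU-specific code: crossprod() and chol() dispatch to the
  ## BLAS/LAPACK that R was built against
  n <- 2000
  a <- matrix(rnorm(n * n), n, n)
  system.time(b  <- crossprod(a))   # t(a) %*% a via the BLAS
  system.time(ch <- chol(b))        # Cholesky factorization via LAPACK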
After experimenting with some of the tools that are available now, I
made the decision here at work not to do anything serious with GPUs
right now. I expect to revisit that decision in a few months, since the
machines at my desk already have reasonably powerful GPU hardware in
them. For now, though, the payoff isn't at a level that makes the extra
work worthwhile for what I do.
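For reference, the kind of quick check behind that call was nothing fancier than timing a GPU routine against its base R counterpart and seeing whether the speedup survived the host-to-device copies, roughly like this (sizes arbitrary, same gputools assumptions as above):

  ## rough timing comparison; the GPU timing includes host<->device copies
  library(gputools)
  x <- matrix(rnorm(2000 * 100), ncol = 100)
  system.time(d.cpu <- dist(x))                           # base R, CPU
  system.time(d.gpu <- gpuDist(x, method = "euclidean"))  # gputools, GPU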
I think that over time, commonly available math libraries and
parallelization frameworks will embrace GPUs, and *then* I'll have more
reason to spend time working with them.
Cheers,
- Brian
--
Brian G. Peterson
http://braverock.com/brian/
Ph: 773-459-4973
IM: bgpbraverock