[R-SIG-Finance] Exploratory analyses: Experience using gputools-package for Nvidia graphics accelerators?
Dirk Eddelbuettel
edd at debian.org
Fri Oct 16 20:39:06 CEST 2009
On 16 October 2009 at 20:10, Gero Schwenk wrote:
| Hi, dear finance modelers!
| I want to employ a systematic approach to exploring promising
| predictors for trading models. I am therefore thinking about
| experimenting with parallel processing on GPUs (Nvidia graphics
| accelerators) - these are quite cheap (150-300€) and usually contain
| more than 240 processing units.
|
| There is a package called "gputools" (
| http://cran.r-project.org/web/packages/gputools/index.html ), which seems
| to originate from the bioinformatics community and implements very
| interesting functionality for exploratory analysis and large-scale
| predictive modeling. This includes calculating distance metrics for
| clustering, testing for Granger causality, approximating mutual
| information, calculating correlation coefficients, and estimating and
| predicting with support vector machines and support vector regression.
|
| Now my question: Does anybody have experience using this package, or
| GPU or parallel processing more generally, for exploration? Or do you
| use other environments or approaches?
For what it is worth, I started looking into this two days ago when I took
possession of such an Nvidia card (with a list price considerably above EUR
300 for its 192 cores), and I too began with the (nice) gputools package.
However, I'd say that this belongs on r-sig-hpc. "Just because" you (and
even I) would like to use it in finance doesn't make it finance. It is still
a methodological question somewhat orthogonal to what we do here, and more at
home on the High-Performance Computing list.
Again, those disagreeing with me are kindly invited to let me know off-list
whether there is in fact a consensus for having such a discussion here.
Dirk
--
Three out of two people have difficulties with fractions.