[R-sig-Geo] optimizing SGSim for large data sets
William Savran
wsavran at gmail.com
Thu Oct 13 23:16:31 CEST 2016
Hi R-Sig-Geo,
I am working on a project related to deterministic probabilistic seismic
hazard analysis (PSHA). I know this sounds like an oxymoron, but the idea
is to use deterministic simulations of seismic wave propagation (owing to
a dearth of data at large magnitudes) to help make statistical inferences
about seismic hazard in a particular area.
This project involves simulating anywhere from several dozen to several
thousand large grids (< 1 million degrees of freedom each) that represent
an earthquake source, using sequential Gaussian simulation (SGSim). Right
now I am using the R gstat package for the simulation, and it works great
for a 'prototype' of the model, but I believe the performance will be too
slow for a 'production' version, where simulating several thousand grids
becomes a reality.
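For context, the kind of gstat call I am timing looks roughly like the
following (the grid size and variogram here are placeholders, not my actual
model); as I understand it, the search-neighborhood size `nmax` is the main
knob controlling runtime:

```r
library(gstat)

# Placeholder grid: 1000 x 1000 cells (~1e6 degrees of freedom)
xy <- expand.grid(x = 1:1000, y = 1:1000)

# Unconditional sequential Gaussian simulation with a dummy gstat object;
# the exponential variogram parameters below are illustrative only
g <- gstat(formula = z ~ 1, locations = ~x + y, dummy = TRUE, beta = 0,
           model = vgm(psill = 1, model = "Exp", range = 100),
           nmax = 20)  # points in the kriging neighborhood; dominates cost

# One realization; 'nsim' would be raised for an ensemble of grids
sim <- predict(g, newdata = xy, nsim = 1)
```

Timing this with `system.time()` for a single realization should roughly
extrapolate to the full ensemble, since realizations are independent.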
Does anyone have benchmarking data comparing the SGSim implementation in R
to an implementation in pure C or Fortran? Or could anyone point me to a C
or Fortran code that performs the simulation, so that I can run it and
share the benchmarking results?
Cheers,
Bill